[HN Gopher] Counterexamples in Type Systems
___________________________________________________________________
Counterexamples in Type Systems
Author : tempodox
Score : 155 points
Date : 2021-05-23 06:40 UTC (1 day ago)
(HTM) web link (counterexamples.org)
(TXT) w3m dump (counterexamples.org)
| tempodox wrote:
| This is an excellent resource for learning about the finer points
| of type systems, precisely because it shows their potential
| breaking points and how things might go awry.
| valenterry wrote:
| Great list!
|
| It shows that it is important to take the soundness of type
| systems seriously. I like what they did with Scala: they
| boiled the language down to its fundamental parts and built a
| language from that core with a sound, machine-checked type
| system. This has now finally become Scala 3.
|
| Here are some more details: https://www.scala-lang.org/blog/2016/02/03/essence-of-scala....
| virtualwhys wrote:
| Cries in the removal of type projections on abstract types
| (e.g. `val x: A#T = ...`).
|
| While Scala 3 filled a soundness hole, it opened a _practical_
| crater (and no, path dependent types are not a viable
| substitute).
|
| Too bad, many nice improvements overall.
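|
| For anyone who hasn't followed this, a minimal sketch (with a
| made-up Graph trait) of what changed: Scala 2 let you project
| a member type through an abstract prefix, while Scala 3 keeps
| only the path-dependent form, which needs a value in scope.
|             trait Graph { type Node }
|
|             // Scala 2: projection through an abstract prefix
|             // def same[G <: Graph](a: G#Node, b: G#Node) = a == b
|
|             // Scala 3: path-dependent, tied to a specific value g
|             def same(g: Graph)(a: g.Node, b: g.Node) = a == b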
| slver wrote:
| > It shows that it is important to take the soundness of
| type systems seriously.
|
| Does it show that? I see it more like a set of examples
| demonstrating there's no such thing as "a sound type system".
| antihero wrote:
| Is the syntax highlighting broken for anyone else?
|
| For me it is pure black on both Firefox and Chrome
| giraffe333 wrote:
| Is it this issue for Firefox?
| https://bugzilla.mozilla.org/show_bug.cgi?id=1520157
|
| See also https://github.com/adobe-fonts/source-code-pro/issues/250
| bob1029 wrote:
| I feel like we don't really need highly-evolved type systems for
| practical problem solving. On the contrary, I have found that the
| plethora of available ways in which to model a problem domain can
| cause more harm than if a more constrained set of tools were made
| available.
|
| What problem domain is so complex that it cannot at least be
| modeled in SQL (or Excel)? What language does not provide
| sufficient abstractions for tables, columns, relations & basic
| data types? Why would you want to take a clean domain model that
| the business agrees upon and then start perverting it with things
| like polymorphism? Performance is often something thrown around
| as an excuse to de-normalize schemas, but typically when I ask
| someone to _actually measure it_ we find that LINQ/SQL/etc is
| more than fast enough to get the job done for all _practical_
| intents & purposes.
| andrewflnr wrote:
| Types like this mostly aren't about data modeling more
| complicated than basic sum and product types (SQL only poorly
| models discriminated sum types, eg ML's data or Rust's enum,
| which is one of my biggest gripes about it). They're about
| correctness, making invalid states inexpressible, and the like.
| Type sums and products (e.g. tuples/structs) really are a nice
| minimal but expressive system.
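|
| For example (a quick sketch in Scala, types made up), a
| discriminated sum makes invalid field combinations
| unrepresentable, which the usual nullable-columns encoding in
| SQL can't do directly:
|             enum Shape:
|               case Circle(radius: Double)
|               case Rect(width: Double, height: Double)
|
|             // the compiler checks this match is exhaustive
|             def area(s: Shape): Double = s match
|               case Shape.Circle(r)  => math.Pi * r * r
|               case Shape.Rect(w, h) => w * h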
|
| That said:
|
| > What language does not provide sufficient abstractions for
| tables, columns, relations & basic data types?
|
| A key part of the SQL data model is the primary key. I'd really
| love to see a programming language make PKs or something like
| them a first-class construct. So if we're looking for something
| neat to add to PLs that would actually help with business
| modeling, maybe we could start there?
| bob1029 wrote:
| > A key part of the SQL data model is the primary key. I'd
| really love to see a programming language make PKs or
| something like them a first-class construct.
|
| This certainly seems like an excellent place to start. Every
| single one of my domain types starts like:
|         public class MySpecialType
|         {
|             public Guid Id { get; set; } = Guid.NewGuid();
|             //...
|         }
|
| Having some notion of a first class identity that I could
| leverage without having to explicitly add a public Id
| property would be worth looking at.
|
| But, what about the relational aspect? Identity is trivial to
| solve IMO. How do we canonicalize this idea of relations
| _between_ types? The way I do this today is:
|         public class MyRelatedType
|         {
|             public Guid Id { get; set; } = Guid.NewGuid();
|             public Guid? MySpecialTypeId { get; set; }
|             //...
|         }
|
| And then I just tie it all together with LINQ statements as
| appropriate.
|
| Is there a better way to express this idea with some shiny
| new language constructs? I feel like I am running out of
| physical lines of code to golf with here.
|
| The only thing I can think of that is more concise than my
| above code examples would be SQL.
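|
| One incremental option (a sketch in Scala, all names
| hypothetical) is a phantom-typed id, so the relation at least
| shows up in the types and ids of different tables can't be
| mixed up by accident:
|             import java.util.UUID
|
|             final case class Id[A](value: UUID = UUID.randomUUID())
|
|             final case class MySpecialType(id: Id[MySpecialType])
|             final case class MyRelatedType(
|               id: Id[MyRelatedType],
|               special: Option[Id[MySpecialType]] // typed "foreign key"
|             )
|
| Not first-class identity, but passing an Id[MyRelatedType]
| where an Id[MySpecialType] is expected becomes a compile
| error.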
| siraben wrote:
| I study type systems and programming language theory at
| university, and I encourage friends studying CS to at least have
| some understanding of type theory, such as well-known type
| systems (simply-typed, System F) and properties like soundness or
| uniqueness of typing. Why? Because as programmers we argue to
| death comparing the expressiveness and properties of type systems
| of our favorite programming languages, or we're picking up a
| language with features like traits or type inference. Type theory
| really lets you learn the heart of such features in a small,
| idealized setting and specifies things rigorously and uniformly
| (since languages tend to come up with their own names and
| examples for things like ad-hoc polymorphism, AKA
| typeclasses/traits/protocols/concepts), and these features
| appear no matter the paradigm.
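|
| To make the "same feature, different names" point concrete,
| here is a toy sketch (in Scala 3) of ad-hoc polymorphism --
| the thing Haskell calls a typeclass, Rust a trait, and Swift
| a protocol:
|             // a "typeclass": behavior attached after the fact
|             trait Show[A]:
|               def show(a: A): String
|
|             given Show[Int] with
|               def show(a: Int) = a.toString
|
|             // works for any A with a Show instance in scope
|             def describe[A](a: A)(using s: Show[A]) = s.show(a)
|
|             @main def demo() = println(describe(42)) // "42"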
| chestervonwinch wrote:
| What is a good introductory text (or other resource) for
| learning? What's the relationship between programming language
| theory and type theory? Is type theory a special topic, or
| would it be considered a core topic in programming language
| theory (e.g., like integration in calculus)?
| cle wrote:
| I love "Types and Programming Languages" by Benjamin Pierce
| and always recommend it for people who want to dip their toes
| into formal type theory. To me it's basically the K&R for
| type theory.
| siraben wrote:
| As others have said, I also recommend Types and Programming
| Languages.
|
| > What's the relationship between programming language theory
| and type theory?
|
| I'd say programming language theory is broader than type
| theory; for instance, topics such as compilation techniques
| and runtime systems are more distant from type theory (though
| you might prove some type-theoretic thing like type
| preservation).
|
| There's also a range of type theory books because it's a
| field that spans from pure logic to programming languages, so
| you
| can find books like Type Theory and Functional Programming by
| Thompson that elaborate things like dependent types early on.
| pansa2 wrote:
| > _What is a good introductory text (or other resource)
| for learning?_
|
| The usual recommendation seems to be "Types and Programming
| Languages" by Benjamin C. Pierce.
| codeismath wrote:
| I found the first 6 chapters of "Type Theory and Formal
| Proof" to be amazing for covering the beginning math behind
| Type Theory. The authors (Nederpelt and Geuvers) don't gloss
| over assumptions and they build things up step by step. I
| didn't study this in university but have picked up many books
| along the way. I'm usually quite lost within the first 10 or
| 20 pages until picking up Type Theory and Formal Proof. This
| one took me somewhere between 100 and 200 pages before getting
| lost. They cover the entire lambda cube.
|
| But then I was able to re-start Thompson's "Type Theory and
| Functional Programming" and it was clear that I started to
| understand a lot of it.
|
| If you enjoy the nostalgia of older books like me, then pick
| up an original copy of "Intuitionistic Type Theory" edited by
| Per Martin-Löf.
| duped wrote:
| The notation is an insurmountable barrier to many.
| agumonkey wrote:
| ah notation is such a subtle and often sad topic
|
| when you click you never give a damn about notation but
| before you do it's such a drag
|
| it's like monads
| steego wrote:
| It's interesting how calling it insurmountable both panders
| to and insults the people you're talking about.
|
| Maybe a better way to describe the notation is "off-putting"?
| duped wrote:
| It's not supposed to be insulting. I think the notation is
| more than offputting, and describing it as a barrier to
| entry is appropriate.
|
| Like many kinds of formal notation it almost requires a
| class in a university to have it explained. The same is
| true of other fields dense in notation where you need to
| learn the conventions as much as the basics.
| cnidaria wrote:
| I don't get it, do you feel like you need a class in a
| university to learn programming? Source code is nothing
| but formal notation. The fact that it kind of looks like
| English text if you squint makes no difference, as far as
| I'm concerned. There's only a handful of symbols you have
| to learn to read type theory stuff.
| BlackFingolfin wrote:
| In my experience, any sufficiently deep study of any
| subject requires introduction of at least some degree of
| specialized terminology ("jargon") to be able to express
| things precisely _and_ concisely. Sure, you can try to
| avoid this and use just "generic" language, but you
| almost always have to sacrifice one or both of these, at
| least partially. That can very well be worth it in
| certain cases (e.g. in an introductory text). But the
| site being discussed here is a list of specific
| (counter)examples. It would seem counterproductive to me
| to "dumb it down" given that it's primary purpose seems
| to be to discuss very specific situations; indeed, one
| common feature among many of these examples is the subtle
| differences in what things mean in different parts of a
| language specification -- i.e., one of the reasons why
| domain experts invariably start using jargon.
|
| That said, of course jargon is often also abused, or used
| in cases where it doesn't have to be, etc. -- that's bad! And
| many papers and books suffer from this and would be
| improved by using less jargon.
|
| But in general I find this negative stance against formal
| notation, and the expectation that one should be able to
| dive into subtle and advanced examples without the need for
| at least some "studying" (not at all necessarily at a
| university!) quite odd, and unrealistic.
| klibertp wrote:
| My main question here is: why is conciseness a good
| thing? Precision is a must, no argument here, but
| conciseness, to me, seems like a premature, and harmful,
| optimization. Is it _really_ that much harder to stay
| within standard ASCII (EDIT: I mean "stay within the
| characters that the reader already knows how to read",
| which on second thought doesn't have to be ASCII at
| all...)? Do you really need to express concepts with a
| single strange character? What does it buy you, given
| that the downsides for readability are quite obvious?
| There are arguments that using a full word for something
| would be confusing due to other/overloaded meanings of
| the word, but then I go read some papers and I see the
| same ideograms used in drastically different meanings,
| too.
|
| Jargon is ok. Jargon is what happens when you want to be
| precise while re-using brain machinery for reading. The
| exotic notations are, on the other hand, just jargon put
| in a form which you have to (re)learn how to read, for
| the sake of conciseness alone. That conciseness may pay
| off if you're going to be working with the notation
| 8h/day for 10 years (or a year, or maybe even half a
| year), but if you just want to read and understand
| concepts and learn facts... To paraphrase: you wanted a
| banana, but what you got was a gorilla holding the banana
| and the whole jungle along with it.
|
| Hence my question: isn't creating exotic notations, where
| one character == 2.5 concepts, simply an optimization which
| is harmful to the vast majority of potential readers?
| Twisol wrote:
| > Do you really need to express concepts with a single
| strange character? What does it buy you, given that the
| downsides for readability are quite obvious?
|
| As someone who dabbles in the field, I'll bite. I find
| that more succinct expressions make their structure more
| apparent than less succinct expressions. This is critical
| for identifying (and then demonstrating) the general
| principles underlying any particular example, which is a
| great deal of what we do in theoretical research. (It's
| usually better to describe a whole family of solutions
| than to describe a single solution; and failing that, to
| indicate some directions potentially leading to general
| principles.)
|
| What do I mean by structure? This is a bit of a dodge,
| but structure is what you get when you remove all the
| data, all the aspects that pin an expression to a
| specific scenario instead of a general pattern. If you
| can recognize and memorize the pattern, you can apply it
| to a much broader class of examples than the one you
| learned it from. Concise notation is one tool for
| downplaying the concrete data and calling out the common
| aspects more deliberately.
|
| As other posters have said, this can be abused. I'd even
| say it's a very rare paper that uses concise notation
| appropriately to its fullest extent as an aid to the
| reader. But the concision does serve an important purpose
| to experts in the field (who need rather less aid): it
| makes the newly contributed patterns more visible (and
| uses existing patterns to effectively _de_-emphasize
| parts).
|
| > but if you just want to read and understand concepts
| and learn facts...
|
| Most research papers do not have as a goal helping a
| technical reader understand concepts and learn facts.
| (There are certainly visible exceptions, see just about
| anything written by Simon Peyton Jones.) Research papers
| are evidence of progress at the forefront of human
| knowledge; they're shared amongst an expert community to
| help drive that whole community forward.
|
| We absolutely need more effort to distill research and
| collect it into a more cohesive picture. Unfortunately,
| that responsibility does not (and probably cannot, in the
| current system) fall on the original researchers
| themselves. There are a few organized efforts out there
| for some fields; the one I know of is Distill.pub, for
| machine learning: https://distill.pub/about/
|
| >> When we rush papers out the door to meet conference
| deadlines, something suffers -- often it is the
| readability and clarity of our communication. This can
| add severe drag to the entire community as our readers
| struggle to understand our ideas. We think this "research
| debt" can be avoided.
|
| But I don't think the concision of notation is at fault.
| It's just the most obvious roadblock to a (relatively)
| lay reader. The truth is simply that the paper wasn't
| written with you in mind.
| AnimalMuppet wrote:
| This is about programming, but the "theory" part of type
| theory is an area of mathematics. Mathematics has always
| operated this way, going back to + and -.
|
| You could make a case that, in order to be of value to
| (most ordinary) programmers, type theory needs to present
| their results in programmer-speak, not in math-speak. But
| then programmers ask questions like "why is that true",
| and the answer is the proof which, being a mathematical
| proof, is in math-speak.
| klibertp wrote:
| > Mathematics has _always_ operated this way, going back
| to + and -.
|
| Wikipedia says[1] the earliest use of characters
| resembling + and - was in the 14th century. From what I
| remember, math books were light on special notation until
| (at least) the 18th century, and another poster here says it
| could be even newer (beginning of 20th century).
|
| > and the answer is the proof which, being a mathematical
| proof, is in math-speak.
|
| There are many, many proofs in Euclid's Elements, and the
| _only_ unusual (ie. not in plain natural language)
| notation used there (from what I remember and after a
| cursory glance now) is using clusters of capital letters
| to denote line segments.
|
| Proofs are just logic, and logic was used for millennia (I
| think?) before someone decided that `∧` is better than
| "and" and we should all use it.
|
| What I'm trying to say is that the "math-speak" is (or
| should be?) defined by what you're talking about, not by
| what syntax. And if this is true, then using more
| familiar syntax would be better for lowering the barrier
| to entry.
|
| On the other hand, as Twisol notes, the modern terse
| syntax probably has its merits for experts. I'm a casual
| user - I won't be writing papers or checking their
| correctness - so I get all the bad (unfamiliar, strange
| symbols, context dependent syntax) without any good
| parts. :(
|
| [1] https://en.wikipedia.org/wiki/Plus_and_minus_signs
| andrepd wrote:
| > Is it really that much harder to stay within standard
| ASCII?
|
| Sorry, I don't follow. What is ASCII? Can you please
| avoid using computer jargon in your text, it makes my
| head hurt and it's an unnecessary barrier to entry. Can
| you please write instead "7-bit encoding of text
| characters"?
|
| :)
| klibertp wrote:
| No, that's jargon - and I explicitly said it's ok. You
| can _read_ the word just fine, and you can also easily
| search for its meaning. Compare with this:
|
| > ...that much harder to stay ∈⟦0x21,0x7D⟧
|
| It may be just me, but I think there's a huge difference
| between the two, even though both are equally made-up.
| joshuamorton wrote:
| Most of this notation predates ASCII and computer
| typesetting in general.
| klibertp wrote:
| I see using this word was a bad choice, I edited the
| post. What I meant was "to stay within the character set
| the reader already knows how to read".
| joshuamorton wrote:
| A more interesting question.
|
| I think there are two answers:
|
| 1. This style of notation was, I believe, first really
| developed in Principia Mathematica in 1910ish. Some of
| the proofs in that book are big, and would be vastly
| longer if they were in plain English.
|
| 2. Part of the goal of that book was precision. They
| could reuse existing words, but those have pesky
| connotations that might mislead. So instead they'd either
| be left to invent new terminology "sqaggle x, y lemmy
| ror, x grorple y" or pick something symbological.
|
| There are really good conversations about intuitiveness
| of notation (see, e.g.
| https://mathoverflow.net/questions/366070/what-are-the-benef...), and I think that comes down to things being
| hard. Notation is almost always created by an expert who
| is making something useful to them. It's not clear that
| the correct expressive notation would even be the same as
| the correct layman's/teaching notation. Those two
| concepts may simply be at odds.
| Twisol wrote:
| The linked MathOverflow answer was written by Dr. Terence
| Tao, and it really is an excellent dissection of the many
| needs levied upon notation. I find the two
| "Suggestiveness" items to be most related to concision as
| practiced in research papers.
|
| This bit from the tail end should not be missed, in our
| context:
|
| > ADDED LATER: One should also distinguish between the
| "one-time costs" of a notation (e.g., the difficulty of
| learning the notation and avoiding standard pitfalls with
| that notation, or the amount of mathematical argument
| needed to verify that the notation is well-defined and
| compatible with other existing notations), with the
| "recurring costs" that are incurred with each use of the
| notation. The desiderata listed above are primarily
| concerned with lowering the "recurring costs", but the
| "one-time costs" are also a significant consideration if
| one is only using the mathematics from the given field X
| on a casual basis rather than a full-time one. In
| particular, it can make sense to offer "simplified"
| notational systems to casual users of, say, linear
| algebra even if there are more "natural" notational
| systems (scoring more highly on the desiderata listed
| above) that become more desirable to switch to if one
| intends to use linear algebra heavily on a regular basis.
| klibertp wrote:
| It's a great explanation, thank you. Yes, as a casual
| user I'm concerned with "one-time costs", ie. the barrier
| to entry. I'm glad that I'm not the only one who noticed
| that these costs are real and make "casual usage" harder
| than it could be otherwise.
|
| Your previous comment, though, is much less optimistic
| (to me): you say that, even if they are aware of this,
| the experts have no incentives to use "simplified
| notational systems". That means that I can either become
| an expert myself, find an expert who will translate for
| me, or be unable to make use of the knowledge contained
| in materials for experts. I wish there was some other
| choice, like Google Translate from math to English...
| Twisol wrote:
| Yes, I would like to see more efforts out there like
| Distill in making cutting-edge research more accessible.
| I don't think we can wish away the significant gap in
| expertise and knowledge embodied by research, though.
| Somebody has to put in the effort to understand it for
| themselves and then unpack it for others. What may be one
| page in a research paper could easily turn into dozens.
|
| I myself have been studying category theory (CT) off and
| on for a few years now, trying to get enough of an
| understanding to be able to explain it to others in the
| software engineering (SE) field. I think there's a lot to
| gain from CT, but it's so strongly founded on distant
| mathematical fields like algebraic topology that it's
| very hard to get at its essence and find tractable
| connections from an SE perspective.
|
| Finding good explanations is really a hard research
| problem of its own; it just isn't funded that way.
|
| > That means that I can either become an expert myself
|
| I don't think there's any way around this. If you
| understand a topic, you have obtained some amount of
| expertise thereof. An expert cannot simply translate for
| you; it's not a merely notational difference. There's a
| body of knowledge that must be transferred.
|
| I would recommend finding an expert who's willing to
| correspond with you on occasion, and look for more
| introductory materials like textbooks and position papers
| in the field. If there's a particular goal you have in
| mind -- say there's a specific paper that achieves
| something you have an interest in -- be up front about
| that; it helps focus the explanations and
| recommendations.
|
| Being an expert doesn't mean you have to have a very
| broad base of expertise. In fact, it could be argued that
| _most_ experts are expert in a very, very small and
| focused niche. The smaller the niche, the less you need
| to digest; but the more rarefied the niche, the more
| stable the successive foundations below you need to be.
| qmsldkfjq wrote:
| > My main question here is: why is conciseness a good
| thing?
|
| Going through a single line of text to understand a
| concept is much better than having to go through two or
| more. I believe you should understand that, if your
| definition of conciseness matches mine.
|
| As a programmer, you know first hand that repeating
| oneself is bad practice: it's better to define a function
| which will encapsulate this computation you need to
| repeat rather than using copy and paste. The same goes
| for any language: having vocabulary to embody recurring
| ideas helps us discuss complex ideas more easily.
|
| > Do you really need to express concepts with a single
| strange character?
|
| It is only strange to you because you don't speak Greek.
| Of course many don't: my pedantic point is that Greeks
| might disagree with you about these characters being
| strange, and less pedantically, the scientific community
| will argue that it's not that strange when you belong to
| their group. There are only a handful of them to remember
| in the context of type theory.
|
| I guess you would want these characters, like Gamma (Γ),
| Delta (Δ), tau (τ) or sigma (σ), to be replaced by their
| ASCII equivalents (eg. G, D, t, s), or perhaps even more
| evocative names? But even if this would be done, you'd
| still have to understand the underlying concepts they
| represent, such as the typechecking context which is
| usually represented by Gamma.
|
| These concepts are indeed not easy to understand: you
| would have to learn to become a Gorilla in order to be
| able to hold that banana you long for, or survive in
| the Jungle that surrounds it.
|
| One good thing about using a different character set is
| that they stand out. There's little chance to confuse
| them with words of vernacular English. Greek
| mathematicians actually might have a harder time with
| these notations than you do :)
| klibertp wrote:
| > As a programmer, you know first hand that repeating
| oneself is bad practice
|
| Yes, but we're talking about the syntax/notation here,
| not the underlying concepts. I see it like this: imagine
| a function which has a single argument, in a dynamic
| language so that we have no type annotation. Let's say I
| named the argument "pair_of_ints". Following your logic,
| I should have named it "i2" instead, because then I
| wouldn't have to repeat myself by writing all the
| p,a,r,o,f,n,t,s characters all over the function!
|
| Another way of saying this: yes, it's good to encapsulate
| pieces of logic in subroutines/functions/methods/etc.
| It's _not_ good, however, to name the functions (let's
| add: in global scope) f, g, h, f2, f3, gh, etc. just to
| avoid typing more characters. Instead, we choose a name
| which can be understood with as little additional context
| as possible.
|
| > having vocabulary to embody recurring ideas helps us
| discuss complex ideas more easily.
|
| Yes. My problem is when all the entries in the vocabulary
| are one (Unicode) character long.
|
| > I guess you would want these characters, like Gamma
| (Γ), Delta (Δ), tau (τ) or sigma (σ), to be replaced by
| their ASCII equivalents (eg. G, D, t, s), or perhaps even
| more evocative names?
|
| The latter, preferably. If something is meant to denote a
| context, I see little reason to name it "Γ" instead of, I
| don't know, "context"?
|
| > But even if this would be done, you'd still have to
| understand the underlying concepts they represent, such
| as the typechecking context which is usually represented
| by Gamma.
|
| Yes, but then I wouldn't need to work to understand the
| syntax, making the process of understanding the concepts
| easier.
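|
| To make it concrete, here is the standard variable rule next
| to a spelled-out rendering of the same thing (my own ad-hoc
| wording, purely for illustration):
|             x : T ∈ Γ
|             ---------  (T-Var)
|             Γ ⊢ x : T
|
| versus something like:
|             "x is recorded with type T in the context"
|             ------------------------------------------
|             "the context assigns x the type T"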
| steego wrote:
| I believe you that it wasn't intended to be insulting.
|
| But let's think about it.
|
| I feel pretty confident that most people on this website
| are capable of learning the core 2400 Chinese characters
| in a year if they spent a few hours a day -- and that's
| literally a foreign notation. A lot of people learn new
| languages all the time.
|
| Kids who don't want to learn calculus, learn calculus
| every day. The notation isn't just awkward, the concepts
| are too. Yet they learn it.
|
| What we're talking about is a small notation. It's a
| handful of symbols. They work predictably and
| consistently and the people learning it are usually
| familiar with the subject on some level.
|
| It's off-putting because it appears foreign, but the
| concepts and actual mechanics are already familiar to
| most of these readers. They just need a Rosetta Stone to
| help get past the initial awkwardness.
|
| When you look at it from that perspective, it is kind of
| insulting to use a hyperbolic word like "insurmountable" on
| a group of programmers.
|
| Many choose not to learn it because they don't want to be
| bothered or don't see the value in investing their time.
|
| Throwing words like insurmountable around seems like it
| breeds learned helplessness.
| duped wrote:
| This is coming from my own experience trying to teach
| theoretical concepts via notation to very smart
| engineers. I won't argue vocabulary, but I stand by what
| I say: it's an insurmountable barrier for many people.
|
| Teaching calculus to children is a good example of why.
| They have a teacher to hold their hands and answer
| questions 40 minutes a day with mandatory homework
| assignments.
|
| If you study concepts on your own, you do not have this
| luxury. It's very difficult to internalize notation when
| you do not use it every day. The problem is not the
| notation itself, but the fact that _no notation can be
| categorically searched and referenced_. It cannot be
| typed or entered into Google easily. It is rarely
| consistent between authors, which is only a non-issue if
| you are already fluent in the notation.
|
| Sometimes you can get away with "what does upside down A
| mean" but consider something like `\forall n \in
| \mathbb{N}`. Imagine if you simply had `unsigned int` (I
| don't want to debate the exact implication of unsigned
| int versus the naturals, but it should serve the point).
|
| When I have tried to explain relatively simple notational
| structures to working engineers, the universal feedback
| is along the lines of "I wish there was a good reference
| for notation because I can't understand this or keep up
| with it."
|
| Pseudo code, or something akin to it, may be more
| ambiguous but is much easier to grok and document.
|
| As an aside: something that constantly irks me is the
| celebration of terseness and convenience. This plagues
| many texts and robs novices and experts alike of
| understanding. Notation that is terse to the point of
| resembling arcane incantations is a problem, and it
| bothers me that academic publications in the applied
| sciences don't recognize it as one.
| sweeneyrod wrote:
| I am pretty sure anyone for whom the notation in `\forall
| n \in \mathbb{N}` presents a barrier to learning
| meaningful amounts of type theory will have greater
| problems from lack of mathematical background.
| duped wrote:
| The mathematical background required is just familiarity
| with notation... which is my entire point. It sounds like
| you don't have a good handle on the mathematical pedagogy
| required to be a good engineer today - it's not what you
| could describe as extensive
| siraben wrote:
| Many introductory texts, such as Types and Programming
| Languages, explain the notation thoroughly and start _really_
| simple, even starting from the inference rule notation, then
| talking about reduction of untyped arithmetic expressions,
| building up to the standard notation of type theory.
| lapinot wrote:
| Looking at Java, Rust or assembly, I find this argument hard
| to swallow ;) It's really worth it to learn some sequent
| calculus basics to be able to read typing rules. It's mostly
| the same thing as a grammar, or a weird inductive type
| ("enum").
| klibertp wrote:
| Look at APL, K, and J. The notation based on keywords is
| not a problem; the problematic notations are the ones which
| abuse strange characters or ideograms, because they require
| you to re-learn the process of reading.
| duped wrote:
| It shouldn't be hard to swallow. Monospaced ascii text that
| is (usually) unambiguous with extensive documentation is
| night and day from formal notation.
|
| At least BNF grammars are legible
| smasher164 wrote:
| The notation really isn't as bad as it originally looks,
| trust me. I was in the same boat recently, but after cracking
| open TAPL, on page 26 you basically get all the notation you
| need for the rest of the book.
|
| Inference rules just transform the traditional format of
| (A && B) -> C
|
| into A B --- C
|
| This format allows you to stack rules on top of each other
| when writing out the derivation of some term, without running
| out of page space.
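|
| For instance, TAPL's rule for conditionals (T-If) stacks its
| three premises over the conclusion:
|             Γ ⊢ t1 : Bool   Γ ⊢ t2 : T   Γ ⊢ t3 : T
|             ----------------------------------------  (T-If)
|                 Γ ⊢ if t1 then t2 else t3 : T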
| duped wrote:
| It seems absurd to care about 'page space' when discussing
| computer science notation...
| BugsBunny1991 wrote:
| I think it's easier to do if you work up to it gradually. I
| read Logic by Wilfrid Hodges a few years back on a whim from
| some recommendation on HN. I didn't think much of the book at
| the time, though I recently picked up Benjamin Pierce's Types
| and Programming Languages. Having seen notation like that
| before made digesting the book much easier (along with
| starting with other introductory texts like Friedman and
| Wand's Essentials of Programming Languages).
| siraben wrote:
| Yeah, Essentials of Programming Languages is another great
| implementation-oriented text; it has some formal
| things like inductive sets (which pop up in PLT anyway) and
| also lots of programming exercises.
| kubb wrote:
| Constructing type systems that allow you to express the things
| that you need in your program while simultaneously constraining
| you to a safe way of doing that is one of those instances of
| theoretical research that have immense practical importance. It's
| good to see Rust slowly establishing itself in the industry,
| because there's hope that the ideas will gain popularity and
| better type systems will enter the mainstream in the coming
| decades.
| slver wrote:
| The true liberation will be when your type system is just a set
| of libraries written in the same language that the type system
| applies to.
|
| Which also means you'd be able to add new semantic type
| behavior specific to a project, not just specific to a
| language.
|
| What we're doing right now is highly redundant: it's
| basically two languages in one. One describes the static
| constraints, and another the runtime ones. There's no need for
| that.
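|
| Scala 3's match types already gesture in this direction (a
| toy sketch, nowhere near the full vision): type-level logic
| written in (almost) the same language as term-level logic.
|             // a type-level function, written as a pattern match
|             type Elem[X] = X match
|               case String   => Char
|               case Array[t] => t
|
|             val c: Elem[String] = 'a'   // reduces to Char
|             val n: Elem[Array[Int]] = 1 // reduces to Int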
| yakubin wrote:
| For the curious: the title seems to echo the excellent
| "Counterexamples in Topology"[1].
|
| [1]: <https://en.wikipedia.org/wiki/Counterexamples_in_Topology>
| mcphage wrote:
| That is a fun book :-)
| [deleted]
| dan-robertson wrote:
| Yeah it's a bit of a theme. See also:
| https://files.eric.ed.gov/fulltext/ED536733.pdf
___________________________________________________________________
(page generated 2021-05-24 23:03 UTC)