[HN Gopher] Category Theory Illustrated - Natural Transformations
       ___________________________________________________________________
        
       Category Theory Illustrated - Natural Transformations
        
       Author : boris_m
       Score  : 178 points
       Date   : 2025-10-01 08:00 UTC (15 hours ago)
        
 (HTM) web link (abuseofnotation.github.io)
 (TXT) w3m dump (abuseofnotation.github.io)
        
       | larodi wrote:
        | this has surfaced on the HN front page at least 5 times
       | 
       | https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...
        
         | phoronixrly wrote:
         | It has been updated as per the author's Mastodon
         | https://mathstodon.xyz/@abuseofnotation/115298450513159834
        
       | auggierose wrote:
       | There are too many pictures in this for my taste. I am currently
       | reading this one, and I like it better so far:
       | https://doi.org/10.1142/13670
        
         | IdontKnowRust wrote:
          | It's the author's intention, since the title explicitly says
         | "Illustrated" =)
        
           | auggierose wrote:
           | Fair enough.
        
       | hamburgererror wrote:
       | What's the thing with category theory? I see this topic discussed
        | quite frequently here but I don't get why people are so into
        | it
        
         | jiggunjer wrote:
         | Lets software designers use fancy words and ask for a raise.
        
         | monarchwadia wrote:
          | For me... it's a very useful mental model for thinking about
         | architecture & logic
        
           | hamburgererror wrote:
           | Why not simply use UML?
        
             | emmelaich wrote:
             | Very different thing.
             | 
             | CT is more of a way to abstract all mathematics.
        
             | ndriscoll wrote:
             | UML doesn't give ideas for how to actually structure
             | things. Category theory is primarily a theory of nice ways
             | things can be put together or form relationships while
             | maintaining invariants.
        
           | ctenb wrote:
           | Examples? I haven't really seen many applications of CT, even
           | though I looked for them since I find the idea of CT
           | interesting
        
             | moralestapia wrote:
             | You have a function that does A() and another function that
             | does B().
             | 
             | Upon careful inspection or after just writing/using them
             | 10,000s of times[1] you realize they are both special cases
             | of one general function f()[2]. Congrats, you're likely
              | doing CT now, though you're barely scratching the surface.
             | 
             | Let's say you find a way to do a function factory that
             | generates explicit instances of f() -> A() and f() -> B()
             | at runtime for your different use cases as they are needed.
             | You do this 100 times, 1,000 times[1] with many different
             | functions, in many different contexts. You eventually
             | realize that if all your functions and their signatures had
             | the same _structure_ [3] it would be quite easy to mix some
             | (or all?) of them with each other, allowing you to handle a
             | perhaps infinite amount of complexity in a way that's very
             | clean to conceptualize and visualize. Isn't this just FP?
             | Yes, they're very intimately related.
             | 
             | By this point you're 99.9999% doing CT now, but remember to
             | shower regularly, touch grass etc.
             | 
             | CT formalized these structures with mathematical language,
             | and it turns out that this line of thinking is very useful
             | in many fields like ours (CS), Math, Physics, etc.
             | 
             | 1. Which is what happened to me.
             | 
             | 2. Which sometimes is a way more elegant and simple
             | solution.
             | 
             | 3. This term is fundamental and has way more meaning than
             | what I could write here and what one would think on a first
             | approach to it.
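              | 
              | A minimal Haskell sketch of the "special cases of one
              | general f()" idea (the function names here are made up for
              | illustration, not from this comment):
              | 
              |     -- two concrete functions...
              |     sumAll :: [Int] -> Int
              |     sumAll = foldr (+) 0
              |     
              |     productAll :: [Int] -> Int
              |     productAll = foldr (*) 1
              |     
              |     -- ...are special cases of one general function,
              |     -- parameterized by an operation and its unit
              |     combine :: (a -> a -> a) -> a -> [a] -> a
              |     combine = foldr
              |     
              |     main :: IO ()
              |     main = do
              |       print (combine (+) 0 [1, 2, 3, 4])  -- 10, same as sumAll
              |       print (combine (*) 1 [1, 2, 3, 4])  -- 24, same as productAll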
        
               | charcircuit wrote:
               | Building an abstraction to pull out common logic does not
               | require category theory.
        
         | siddboots wrote:
         | It's just a good set of models to use to think about all sorts
         | of different mathematical systems, kind of like a unified
         | vocabulary. Beyond undergraduate level, category theory these
         | days plays a huge role within many vast fields - e.g.,
         | algebraic geometry, algebraic topology, or representation
         | theory.
        
           | sesm wrote:
           | I think your reply overstates the importance of category
           | theory in mathematics and doesn't give any hint on what it is
           | about.
           | 
           | IMO a better reply would be: category theory appeared to
           | unify the concepts around using discrete objects to prove the
            | properties of continuous objects in topology, like fundamental
            | groups, homology groups and homotopy groups. It is only
            | practically useful for very advanced proofs like the 2nd Weil
           | Conjecture. Any usage of it in programming is only an analogy
           | and is not mathematically rigorous (see
           | https://math.andrej.com/2016/08/06/hask-is-not-a-category/)
        
             | Iwan-Zotow wrote:
              | Wasn't that corrected already? I mean the categorical
             | definition of Hask?
        
               | sesm wrote:
               | If it was, I would like to see the link
        
         | tristramb wrote:
         | Category theory is what you get when you take mappings instead
         | of sets as the primitive objects of your universe. At first
         | this might seem a perverse thing to do as mappings seem more
         | complex than sets, but that is just because traditionally
         | mappings have usually been defined in terms of sets.
         | 
         | In set theory you can specify that two sets be equal and you
         | can also specify that one set be an element of another.
         | 
         | In category theory you can specify that two mappings be equal
         | and you can also specify that two mappings compose end to end
         | to produce a third mapping.
         | 
         | Category theory can be used to express some requirements in a
         | very concise way.
        
           | thomasahle wrote:
           | > Category theory can be used to express some requirements in
           | a very concise way.
           | 
            | Can you give an example?
        
             | tristramb wrote:
              | Take a mapping a and precompose it with the identity
              | mapping i. By the definition of the identity mapping the
              | resulting composition is equal to a.
              | 
              |     i;a = a
              | 
              | (Here ';' represents forward composition. Mathematicians
              | tend to use backward composition represented by '∘' but I
              | find backward composition awkward and error-prone and so
              | avoid using it.)
              | 
              | Now, if there is another mapping j that is different from
              | i, such that
              | 
              |     j;a = a
             | 
             | then the mapping a loses information. By this I mean that
             | if you are given the value of a(x) you cannot always
             | determine what x was. To understand this properly you may
             | need to work through a simple example by drawing circles,
             | dots and arrows on a piece of paper.
             | 
             | If there is no such j then mapping a is said to be a
             | monomorphism or injection (the set theoretic term) and it
             | does not lose information.
             | 
             | This specification of the property 'loses information' only
             | involves mapping equality and mapping composition. It does
             | not involve sets or elements of sets.
             | 
             | An example of a mapping that loses information would be the
             | capitalization of strings of letters. An example of a
             | mapping that you would not want to lose information would
             | be zip file compression.
             | 
             | If you alter the above specification to use post-
             | composition (a;i = a and a;j = a) instead of pre-
             | composition you get epimorphisms or surjections which
             | capture the idea that a mapping constrains all the values
             | in its codomain. I like to think of this as the mapping
             | does not return uninitialized values or 'junk' as it is
             | sometimes called.
             | 
             | Bartosz Milewski works through this in more detail
             | (including from the set-theoretic side) in the last 10
              | minutes of https://www.youtube.com/watch?v=O2lZkr-aAqk&list=PLbgaMIhjbm...
              | and the first 10 minutes of
              | https://www.youtube.com/watch?v=NcT7CGPICzo&list=PLbgaMIhjbm....
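              | 
              | A quick Haskell illustration of the "loses information"
              | distinction (the function names are my own, not from the
              | parent comment):
              | 
              |     import Data.Char (toUpper)
              |     
              |     -- capitalization loses information: distinct inputs collide
              |     capitalize :: String -> String
              |     capitalize = map toUpper
              |     
              |     -- adding one does not: the input is always recoverable
              |     next :: Integer -> Integer
              |     next = (+ 1)
              |     
              |     main :: IO ()
              |     main = do
              |       print (capitalize "Ab" == capitalize "aB")  -- True: collision
              |       print (pred (next 41))                      -- 41: recovered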
        
           | tristramb wrote:
           | So category theory is really the theory of composition of
           | mappings. I conjecture that all programming can be seen as
           | just the composition of mappings. If this is correct then
           | category theory is a theory of programming.
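            | 
            | For instance, here is a small Haskell program written purely
            | as a composition of mappings (the task is made up, just to
            | show the shape):
            | 
            |     -- count the non-empty lines of a document
            |     countNonEmpty :: String -> Int
            |     countNonEmpty = length . filter (not . null) . lines
            |     
            |     main :: IO ()
            |     main = print (countNonEmpty "one\n\ntwo\nthree\n")  -- 3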
        
             | sesm wrote:
             | You don't need category theory to connect dots with arrows,
             | graph theory is enough for this.
        
               | Twey wrote:
               | Category theory is actually a 'simplified' graph theory,
               | i.e. you can see categories as a restricted class of
                | graphs. E.g. 'Category Theory for Computing Science'
               | introduces categories this way (a category is a directed
               | graph with associative composition and identity; the free
               | category on a graph is the graph with all identities and
               | compositions filled in). But the restrictions
               | (associative composition and identity) are harmless and
               | natural for programming applications where there's always
               | a notion of 'do nothing' or 'do one thing after another',
               | and unlock a lot of higher structure.
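                | 
                | A rough Haskell sketch of that "free category on a graph"
                | idea (my own toy types, not from the book): objects are
                | nodes, morphisms are paths, identity is the empty path,
                | and composition is concatenation of paths whose endpoints
                | line up.
                | 
                |     type Node = String
                |     data Edge = Edge { src :: Node, dst :: Node } deriving Show
                |     newtype Path = Path [Edge] deriving Show
                |     
                |     identity :: Path
                |     identity = Path []
                |     
                |     -- only defined when the endpoints match, mirroring the
                |     -- fact that not every pair of arrows composes
                |     compose :: Path -> Path -> Maybe Path
                |     compose (Path xs) (Path ys)
                |       | null xs || null ys || dst (last xs) == src (head ys)
                |           = Just (Path (xs ++ ys))
                |       | otherwise = Nothing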
        
               | sesm wrote:
               | But what's the utility of this definition? Does it help
               | solve or prove something?
        
               | Twey wrote:
               | It helps you build an intuition for categories, if you're
               | used to graphs :)
               | 
               | If you have a working intuition for categories then in
               | most cases the specific formulation you choose as a
               | foundation doesn't matter, just as most mathematicians
               | work nominally in set theory without worrying about the
               | subtleties of ZFC.
        
               | griffzhowl wrote:
               | If you allowed infinite graphs maybe. How would you
               | define a functor or natural transformation in graph
               | theory? Seems like you would need to construct a
               | conceptual system that is just equivalent to category
               | theory
        
               | xanderlewis wrote:
               | No, but if you want to talk about _composing_ those
               | arrows (and a sensible notion of composition should
               | probably be associative, and perhaps have a unit) you
               | eventually end up reinventing category theory.
        
             | ihm wrote:
             | It can very much be. Here's one example of this phenomenon
             | (there are many others but this is the most famous):
             | https://wiki.haskell.org/Curry-Howard-Lambek_correspondence
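              | 
              | For a tiny Haskell taste of the correspondence (a standard
              | textbook example, not taken from that page): the type of
              | `curry` reads as the tautology ((A AND B) -> C) -> (A ->
              | (B -> C)), and a total function of that type is its proof.
              | 
              |     curryProof :: ((a, b) -> c) -> (a -> (b -> c))
              |     curryProof f = \x y -> f (x, y)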
        
           | ctenb wrote:
           | > you can also specify that two mappings compose
           | 
           | Two mappings with corresponding domain/codomain _have_ to
            | compose by definition of a category. It's not something you
           | can specify.
        
             | ajkjk wrote:
             | That is probably what they mean by specifying that they
             | compose.
             | 
             | If all you know is that you have two mappings you don't
             | know they compose, until you get the additional information
             | about their sources and targets. In a way that's what the
             | source and targets are: just labels of what you can compose
             | them with.
        
             | tristramb wrote:
             | Yes. When you are specifying a system you are building the
             | category that you want it to live in.
        
           | alfiedotwtf wrote:
           | > Category theory is what you get when you take mappings
           | instead of sets as the primitive objects of your universe
           | 
            | Why have I never seen it explained like this before? Wow,
           | thank you!
        
           | stared wrote:
           | > Category theory can be used to express some requirements in
           | a very concise way.
           | 
            | Could you give an example in programming of what can be
            | expressed more easily in CT than with sets and functions?
        
             | griffzhowl wrote:
             | It's more that category theory foregrounds the functions
              | themselves, and their relationships, rather than the
             | elements of sets which the functions operate on. This
             | higher-level perspective is arguably the more appropriate
             | level when thinking about the structure of programs.
             | 
             | For more detail, see Bartosz Milewski, Category Theory for
             | Programmers:
             | 
             | "Composition is at the very root of category theory -- it's
             | part of the definition of the category itself. And I will
             | argue strongly that composition is the essence of
             | programming. We've been composing things forever, long
             | before some great engineer came up with the idea of a
             | subroutine. Some time ago the principles of structured
             | programming revolutionized programming because they made
             | blocks of code composable. Then came object oriented
             | programming, which is all about composing objects.
             | Functional programming is not only about composing
             | functions and algebraic data structures -- it makes
             | concurrency composable -- something that's virtually
             | impossible with other programming paradigms."
             | 
              | https://bartoszmilewski.com/2014/10/28/category-theory-for-p...
        
           | griffzhowl wrote:
           | > Category theory is what you get when you take mappings
           | instead of sets as the primitive objects of your universe.
           | 
           | I'm not sure about that, because you still need some concept
           | of set (or collection or class) to define a category, because
           | you need a set of objects and mappings between them
           | (technically that's a "small" category, but to define any
           | larger category would require at least as much set-
           | theoretical complication).
           | 
           | More exactly, whereas in set theory it's the membership
           | relation between sets and their elements that is basic, in
           | category theory it's the mapping between objects.
           | 
           | Nevertheless, the basic concepts of set theory can also be
           | defined within category theory, so in that sense they're
           | inter-translatable. In each case though, you need some
           | ambient idea of a collection (or class or set) of the basic
           | objects. Tom Leinster has a brilliantly clear and succinct (8
           | pages) exposition of how this is done here
           | https://arxiv.org/abs/1212.6543
           | 
           | The thing is, even defining first-order logic requires a
           | (potentially infinite) collection of variables and constant
           | terms; and set theory is embedded in first-order logic, so
           | both set theory and category theory are on the same footing
           | in seemingly requiring a prior conception of some kind of
           | potentially infinite "collection". To be honest I'm a bit
            | puzzled as to how that works logically.
        
         | jesuslop wrote:
          | It's of course the theory behind monads, which since Eugenio
          | Moggi have been used to model computational effects in pure
          | functional languages: effects such as state, optional return
          | types used in turn for error handling (the Maybe monad),
          | input/output, reader/writer, and others. Beyond effects,
          | Wadler used monads for parsers (monadic parsing).
         | 
          | The Curry-Howard "isomorphism" (slogan: propositions are types,
          | proofs are programs/functions) maps code to logic in a
          | categorical way, first described in a book by Lambek and Scott,
          | with uses in formal software verification.
         | 
          | Categories provide abstraction. You first distill the behavior
          | of how Haskell (or your other pet functional language) works
          | via Hask, the category of Haskell types, and then you can apply
          | your abstract distillate to other categories and obtain task-
          | oriented, tailored computing concepts that enrich bare language
          | capabilities, providing applications including 1) probabilistic
          | programs 2) automatic differentiation. Conal Elliott has very
          | concrete work along these lines. When he speaks of CCCs
          | (following Lambek) he alludes to cartesian closed categories,
          | whose crucial property is having a type constructor for function
          | spaces and higher-order functions. See his "compiling to
          | categories" for a very concrete, hands-on feel. Another
          | application he shows is hardware synthesis (baking your
          | functional algorithm into a netlist of logical gates to automate
          | the design of custom hw accelerators).
         | 
         | In short, why categories? computational effects, formal
         | verification and the equivalence of simply-typed lambda-
         | calculus with cartesian closed categories, with lambda-calculus
         | being the backbone of functional programming language
         | semantics.
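          | 
          | A small Haskell sketch of the first point, the Maybe monad
          | modelling an optional return used for error handling (the
          | lookup table is made up):
          | 
          |     import qualified Data.Map as Map
          |     
          |     ages :: Map.Map String Int
          |     ages = Map.fromList [("ada", 36), ("alan", 41)]
          |     
          |     -- each step may fail; the monad threads the "missing" effect
          |     ageDifference :: String -> String -> Maybe Int
          |     ageDifference a b = do
          |       x <- Map.lookup a ages
          |       y <- Map.lookup b ages
          |       pure (abs (x - y))
          |     
          |     main :: IO ()
          |     main = do
          |       print (ageDifference "ada" "alan")   -- Just 5
          |       print (ageDifference "ada" "grace")  -- Nothing, no crash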
        
           | Twey wrote:
           | I'd phrase this a tiny bit differently: monads give a model
           | of effects in _impure_ languages and are important for that
           | reason. The fact that Haskell chooses to emphasize monads in
           | the language itself is cool, but their utility is not
           | restricted to pure functional languages; quite the opposite!
           | In a pure functional language you don't need to think about
           | effects at all, and the only model you need is mathematical
           | functions, which are much simpler.
        
         | Twey wrote:
         | Category theory is popular in computer science because, at a
         | fundamental level, they're very compatible ways of seeing the
         | world.
         | 
         | In computing, we think about:
         | 
         | - a set of states
         | 
         | - with transformations between them
         | 
         | - including a 'do nothing' transformation
         | 
         | - that can be composed associatively (a sequence of statements
         | `{a; b;}; c` transforms the state in the same way as a sequence
         | of statements `a; {b; c;}`)
         | 
         | - but only in certain ways: some states are unreachable from
         | other states
         | 
         | This is exactly the sort of thing category theory studies, so
         | there's a lot of cross-pollination between the disciplines.
         | Computation defines interesting properties of certain
         | categories like 'computation' or 'polynomial efficiency' that
         | can help category theorists track down interesting beasts to
         | study in category theory and other branches of mathematics that
         | have their own relationships to category theory. Meanwhile,
         | category theory can give suggestions to computer science both
         | about what sort of things the states and transformations can
         | mean and also what the consequences are of defining them in
         | different ways, i.e. how we can capture more expressive power
         | or efficiency without straying too far from the comfort of our
         | 'do this then do that' mental model.
         | 
         | This latter is really helpful in computer science, especially
         | in programming language or API design, because in general it's
         | a really hard problem to say, given a particular set of basic
         | building blocks, what properties they'll have when combined in
         | all the possible ways. Results in category theory usually look
         | like that: given a set of building blocks of a particular form,
         | you will always be able to compose them in such a way that the
         | result has a desired property; or, no matter how they're
         | combined, the result will never have a particular undesired
         | property.
         | 
         | As an aside, it's common in a certain computer science
         | subculture (mostly the one that likes category theory) to talk
         | about computing in the language of typed functional
         | programming, but if you don't already have a deep understanding
         | of how functional programming represents computation this can
         | hide the forest behind the trees: when a functional programmer
         | says 'type' or 'typing context' you can think about sets of
         | potential (sub)states of the computer.
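          | 
          | To make the middle bullets concrete, a tiny Haskell sketch
          | (the state type and statements are made up): statements are
          | functions from state to state, `id` is the do-nothing
          | statement, and sequencing is composition, which is
          | automatically associative.
          | 
          |     type State = Int
          |     
          |     incr, double :: State -> State
          |     incr = (+ 1)
          |     double = (* 2)
          |     
          |     -- forward sequencing of statements
          |     (>>>) :: (State -> State) -> (State -> State) -> (State -> State)
          |     f >>> g = g . f
          |     
          |     main :: IO ()
          |     main = do
          |       print (((incr >>> double) >>> incr) 3)  -- 9
          |       print ((incr >>> (double >>> incr)) 3)  -- 9: associativity
          |       print ((incr >>> id) 3)                 -- 4: id does nothing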
        
           | pdpi wrote:
            | > - with transformations between them
            | >
            | > - including a 'do nothing' transformation
            | >
            | > - that can be composed associatively (a sequence of
            | > statements `{a; b;}; c` transforms the state in the same
            | > way as a sequence of statements `a; {b; c;}`)
           | 
           | And this right here is that monoid in the famous "A monad is
           | just a monoid in the category of endofunctors" meme.
        
           | stared wrote:
            | Still, what is, in your opinion, the advantage of thinking in
            | category theory rather than set theory? (For programming, not
            | algebraic geometry.)
            | 
            | I mean, all the examples I have heard can be directly treated
            | with groups, monoids, and regular functions.
            | 
            | I know some abstract concepts that can be defined in a nice
            | way with CT but not nearly as easily in set theory, e.g. the
            | (abstract) tensor product. Yet, for other concepts, including
            | quantum mechanics, I have found that there is "abstract
            | overhead" of CT with little added value.
        
             | Twey wrote:
             | In my opinion, the important advantage of category theory
             | over set theory in (some!) computational contexts is that
             | it allows you to generalize more easily. Generalizing from
             | sets and functions to objects and morphisms lets you play
             | around with instantiating those objects and morphisms with
             | a variety of different beasts while maintaining the
             | structure you've built up on top of them, and you can even
             | use it to modularly build towers of functionality by
             | layering one abstraction on top of another, even if you
             | choose later to instantiate one of those layers with good
             | old sets and functions. By contrast, it's hard to imagine
             | treating something like async functions with plain old set
             | theory: while there is of course a way to do it, you'd have
             | to reason about several different layers of abstraction
             | together to get all the way down to sets in one step.
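              | 
              | A rough Haskell sketch of "keep the structure, swap the
              | instantiation" (the pipeline and names are made up): the
              | code is written once against an abstract Monad and then
              | instantiated with plain values (Identity) or with effects
              | (IO).
              | 
              |     import Control.Monad (foldM)
              |     import Data.Functor.Identity (Identity (..), runIdentity)
              |     
              |     -- one definition, usable at many "layers"
              |     sumWith :: Monad m => (Int -> m Int) -> [Int] -> m Int
              |     sumWith step = foldM (\acc x -> (acc +) <$> step x) 0
              |     
              |     logged :: Int -> IO Int
              |     logged x = print x >> pure x
              |     
              |     main :: IO ()
              |     main = do
              |       print (runIdentity (sumWith Identity [1, 2, 3]))  -- 6
              |       _ <- sumWith logged [1, 2, 3]  -- same code, now effectful
              |       pure ()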
        
         | coderatlarge wrote:
         | one of the things i took away about category theory is that it
         | allows you to avoid repeating certain arguments which boil down
         | to so-called "abstract nonsense" ie they have nothing to do
         | with the specific objects you're dealing with but rather are a
         | consequence of very generic mapping relationships between them.
         | maybe better versed people can give specifics.
         | 
         | as a very broad example there are multiple ways to define a
         | "homology" (ex simplicial, singular, etc) functor associating
         | certain groups to topological spaces as invariants. but the
         | arguments needed to prove properties of the relationships
         | between those groups can be derived from very general
         | properties of the definitions and don't need to be re-argued
         | from the very fine definitions of each type of homology.
         | 
         | i think.
        
         | shae wrote:
         | At the 2018(?) ICFP, I sat between John Wiegley and Conal
          | Elliott. They talked about expressing and solving a programming
         | problem in category theory, and then mapping the solution into
         | whatever programming language their employer was using. From
         | what they said, they were having great success producing
         | efficient and effective solutions following this process.
         | 
         | I decided to look for other cases where this process worked.
         | 
         | I found several, but one off the top of my head is high
         | dimensional analysis, where t-SNE was doing okay, and a group
         | decided to start with CT and try to build something better, and
         | produced UMAP, which is much better.
         | 
         | In short, this does work, and you can find much better
         | solutions this way.
         | 
         | (random link
         | https://stats.stackexchange.com/questions/402668/intuitive-e...
         | )
        
           | w10-1 wrote:
           | > you can find much better solutions this way
           | 
           | ... because mappings map nicely to functions
        
         | esafak wrote:
         | A lot of us don't get it and want to know what we're missing :)
        
         | T-R wrote:
         | Abstract Algebra, looked at through the lens of Programming, is
         | kind of "the study of good library interface design", because
         | it describes different ways things can be "composable", like
         | composing functions `A -> B` and `B -> C`, or operators like `A
         | <> A -> A`, or nestable containers `C<C<T>> -> C<T>`, with laws
         | clearly specifying how to ensure they don't break/break
         | expectations for users, optimizers, etc. Ways where your output
         | is in some sense the same as your input, so you can break down
         | problems, and don't need to use different functions for each
         | step.
         | 
         | Category Theory's approach of "don't do any introspection on
         | the elements of the set" led it to focus on some structures
         | that turned out to be particularly common and useful (functors,
         | natural transformations, lenses, monads, etc.). Learning these
         | is like learning about a new interface/protocol/API you can
         | use/implement - it lets you write less code, use out-of-the-box
         | tools, makes your code more general, and people can know how to
         | use it without reading as much documentation.
         | 
         | Focusing on these also suggests a generally useful way to
         | approach problems/structuring your code - rather than
         | immediately introspecting your input and picking away at it,
          | instead think about the structural patterns of the computation,
         | and how you could model parts of it as transformations between
         | different data structures/instances of well-known patterns.
         | 
         | As a down-to-earth example, if you need to schedule a bunch of
         | work with some dependencies, rather than diving into hacking
         | out a while-loop with a stack, instead model it as a DAG,
         | decide on an order to traverse it (transform to a list), and
         | define an `execute` function (fold/reduce). This means just
         | importing a graph library (or just programming to an interface
         | that the graph library implements) instead of spending your day
         | debugging. People generally associate FP with recursion, but
         | the preferred approach is to factor out the control flow
         | entirely; CT suggests doing that by breaking it down into
         | transformations between data structures/representations. It's
         | hugely powerful, though you can also imagine that someone who's
         | never seen a DAG might now be confused why you're importing a
         | graph library in your code for running async jobs.
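          | 
          | A rough Haskell sketch of that last paragraph (the task names
          | and runTask stub are invented): model the work as a DAG,
          | transform it to a dependency-respecting list, then fold over
          | the list to execute.
          | 
          |     import Data.Graph (graphFromEdges, topSort)
          |     
          |     -- each task lists the tasks it depends on
          |     tasks :: [(String, String, [String])]
          |     tasks =
          |       [ ("compile", "compile", ["fetch"])
          |       , ("fetch",   "fetch",   [])
          |       , ("test",    "test",    ["compile"])
          |       ]
          |     
          |     runTask :: String -> IO ()
          |     runTask name = putStrLn ("running " ++ name)
          |     
          |     main :: IO ()
          |     main = do
          |       let (graph, nodeFromVertex, _) = graphFromEdges tasks
          |           -- topSort puts a task before the tasks it points at
          |           -- (its dependencies), so reverse to run deps first
          |           order = reverse (topSort graph)
          |           names = [n | v <- order, let (n, _, _) = nodeFromVertex v]
          |       mapM_ runTask names  -- fold/reduce: execute in order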
        
         | tikhonj wrote:
         | Category theory gives us a nice, high-level set of conceptual
         | tools to try to understand and generalize over things that are
         | hard to connect otherwise. Some people find that useful
         | directly, other people just enjoy it for its own sake, or even
         | for aesthetic reasons. (I think all three are totally
         | reasonable!)
         | 
         | At the same time, it's actually rather _more_ accessible than
         | most other areas of pure math--at least at the level that
         | people talk about it online. Basic category theory can be hard
          | to learn because it's so abstract but, unlike almost any other
          | area of math from the 20th century onwards, it has almost no
         | hard prerequisites. You can reasonably learn about categories,
         | functors, natural transformations and so on without needing a
         | graduate degree's worth of math courses first. You might not
         | understand the most common _examples_ mathematicians use to
          | illustrate category theory ideas--but it's such a general
         | framework that it isn't hard to find alternate examples from
         | computer science or physics or whatever else you already know.
         | In fact, I expect most of the articles that get talked about
         | here do exactly that: illustrate category theory ideas with
         | CS/programming examples that folks on HN find relevant and
         | accessible.
        
           | senderista wrote:
           | > You can reasonably learn about categories, functors,
           | natural transformations and so on without needing a graduate
           | degree's worth of math courses first.
           | 
           | This is the whole premise of _Conceptual Mathematics_:
           | category theory for high school students.
        
         | chermi wrote:
         | Pinging the https://planting.space/ people! I know some of them
         | are on HN, at least recruiting in the past. I haven't updated
         | my knowledge in a while, but my impression of the company was
          | basically that they were going to make money using category
          | theory (+ all math) in clever and useful ways. I think they
          | turned toward AI a little, but at root they're a bunch of
         | people who think category theory is useful. Hence, the ping!
        
       | sesm wrote:
       | > In the course of this book, we learned that
       | programming/computer science is the study of the category of
       | types in programming languages.
       | 
       | This is a golden quote.
        
         | ctenb wrote:
         | It's also wrong, since computer science is traditionally mostly
         | about computation, which has nothing to do with CT
        
           | Twey wrote:
           | Insofar as 'computation' is about mapping one state or value
           | to another state or value, it has a lot to do with CT!
           | 
           | The question of whether CT is _useful_ for studying
           | computation is different, and there are certainly other
           | lenses you can see computation through that some people would
           | argue are better. But it's hard to deny that they're
           | _related_.
        
             | ctenb wrote:
             | I mean, technically almost all of math can be related to
              | other math one way or another. To say that CT has a _lot_ to
              | do with computation is definitely a stretch. CT is not a
              | recognized Computer Science subject. It's mostly used in
              | the functional programming community to name certain
              | concepts and theorems, but then applied to a specific type
              | system (so it's not actually doing CT, since you're
              | restricting yourself to a single category, whereas CT is
             | really about connecting different categories by
             | generalizing over them).
        
               | sesm wrote:
               | Good point, algebraic topology is mostly concerned with
                | Top -> Grp functors to prove the properties of continuous
                | transformations, while the Haskell community focuses on Hask ->
               | Hask endofunctors to use fancy names for mundane things.
        
               | Twey wrote:
               | > To say the CT has a lot to do with computation is
               | definitely a stretch.
               | 
               | I think the argument I presented above adequately
               | justifies CT as fundamentally connected to computation,
               | at least as we study it today, though I think there are
               | other formalisms as well that are just as worthy.
               | 
               | > CT is not a recognized Computer Science subject
               | 
               | I'm not sure who decides this :) Certainly when I studied
               | computer science we had modules on category theory, along
               | with other applicable 'discrete math' subjects.
               | 
               | CS is a bit of a grab-bag of a discipline, so it's very
               | hard to say what does or doesn't belong on a CS
               | curriculum. But I think programming language semantics,
               | which is the field where you see CT pop up the most, is
               | probably uncontroversially CS, or at least I don't know
               | what other discipline would claim it.
               | 
               | > It's mostly used in the functional programming
               | community to name certain concepts and theorems
               | 
               | It's mostly used in the PL community; its use in
               | functional programming is an attempt to connect the
               | language more directly to that lineage (or because
               | functional programming languages traditionally came from
               | that community, where the language is already understood
               | and available to describe things).
               | 
               | > but then applied to a specific type system (so it's not
               | actually doing CT, since your restricting yourself to a
               | single category, whereas CT is really about connecting
               | different categories by generalizing over them)
               | 
               | I agree that there's not a lot of interesting category
               | theory to do when restricted to a single category, but
               | this isn't quite correct. Some type systems, e.g. simply
               | typed lambda calculus, are adequately _described_ by a
               | single category, though in more advanced cases you
               | usually need more than one (e.g. dependently typed
               | languages are often described by 'categories with
               | families', which are a slightly more sophisticated
               | category-theoretical construction). But even within a
               | single type system you usually find many categories; for
               | example a type system can often be described by a
               | category of terms that map between its typing contexts,
               | but within that category you also have a category of
               | types and functions between them, a category of
               | (functorial) type constructors and natural
               | transformations, et cetera. An important family of
               | examples are the Kleisli categories over monads: every
               | monad actually defines a category of its own, whose
               | categorical laws give the good behaviour of monads that
               | make them useful for composing effectful computations.
               | Interactions between these different categories are where
               | inspiration comes from for CT-flavoured API design, e.g.
                | the 'interpreter' pattern that's popular in Haskell
                | (https://softwareengineering.stackexchange.com/questions/2427...).
                | Even in pure CT the study of 'connecting different
               | categories' can be seen mostly as taking a sufficiently
               | close look at the category `Cat` of categories and
               | functors!
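                | 
                | A minimal Haskell sketch of that Kleisli point (the
                | parsing example is mine): in the Kleisli category of
                | Maybe, morphisms are functions a -> Maybe b, identity is
                | `pure`, and composition is (>=>).
                | 
                |     import Control.Monad ((>=>))
                |     import Text.Read (readMaybe)
                |     
                |     parse :: String -> Maybe Int
                |     parse = readMaybe
                |     
                |     recipOf :: Int -> Maybe Double
                |     recipOf 0 = Nothing
                |     recipOf n = Just (1 / fromIntegral n)
                |     
                |     -- Kleisli composition of the two effectful steps
                |     parseRecip :: String -> Maybe Double
                |     parseRecip = parse >=> recipOf
                |     
                |     main :: IO ()
                |     main = do
                |       print (parseRecip "4")     -- Just 0.25
                |       print (parseRecip "0")     -- Nothing
                |       print (parseRecip "oops")  -- Nothing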
        
       | gnarlouse wrote:
       | When I read this title I thought it was going to be rings and
       | groups in bikinis. I'm so dumb.
        
       | michaelcampbell wrote:
       | Had to reduce the page to 67% to get out of "Fisher Price" font
       | size, but otherwise quite interesting.
        
       | YetAnotherNick wrote:
        | It makes it so much more complicated than what is needed to
        | understand natural transformations. A natural transformation is
        | just a mapping between two functors. You can discover the laws
        | yourself just from this.
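        | 
        | In Haskell terms (a standard example, not from the book): a
        | natural transformation is a polymorphic function between two
        | functors, here [] and Maybe, and the naturality law says mapping
        | then transforming equals transforming then mapping.
        | 
        |     import Data.Maybe (listToMaybe)
        |     
        |     -- a mapping between the functors [] and Maybe
        |     safeHead :: [a] -> Maybe a
        |     safeHead = listToMaybe
        |     
        |     -- naturality: fmap f . safeHead == safeHead . fmap f
        |     main :: IO ()
        |     main = do
        |       print (fmap (* 2) (safeHead [1, 2, 3]))  -- Just 2
        |       print (safeHead (fmap (* 2) [1, 2, 3]))  -- Just 2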
        
       | gcr wrote:
       | Anyone who likes this might also like Stefan Miller's paper, "a
       | simple category theoretical understanding of category theory
       | diagrams", appearing in SIGBOVIK 2014. See
       | https://sigbovik.org/2014/proceedings.pdf (starts on PDF page 65,
       | or page 57 if you go by margin page numbers)
        
       | anal_reactor wrote:
       | I hate this particular mix of prose and formalism. Too
       | complicated to be pop-sci, too informal to be, well, formal. I
       | got to this part:
       | 
       | > We know that two orders are isomorphic if there are two
       | functors, such that going from one to the other and back again
       | leads you to the same object.
       | 
        | And I have no clue what a functor or an order is. "Functor" wasn't
       | defined, and "order" is defined as "thin category", which in turn
       | remains undefined.
       | 
       | Seems to me like in order to understand this text you already
       | need to understand category theory. If that's the case, then why
       | would you be reading it?
        
         | gs17 wrote:
         | It's the newest chapter of a book, the previous one defines
         | functors.
        
         | mjh2539 wrote:
         | I agree. There was (and still is) a trend in technical writing
         | that began in the 2010s to be overly pedestrian and informal
         | (and in many cases explicitly vulgar). The same impulse or
         | geist resulted in many people naming their library or product a
         | cute or contrived or irrelevant word.
         | 
         | Get off my lawn! And start giving things long descriptive names
         | that are aliased to acronyms again! db2, netcat, socat, emacs
         | (editing macros), wget...etc.
        
       | rck wrote:
       | This is fun. But the bit at the beginning about philosophy is not
       | correct. Parmenides did not believe in what we would call
       | essences, but really did believe that nothing ever changes (along
        | with his fellow Eleatic philosopher Zeno, of paradox fame). The
       | idea that change is an illusion is pretty silly, and so Plato and
       | especially Aristotle worked out what's wrong with that and
       | proposed the idea of _forms_ in part to account for the nature of
       | change. Aristotle extended Plato's idea and grounded it in
       | material reality which we observe via the senses, and that's
       | where the concept of essence really comes from - "essence" comes
       | from the Latin "essentia" which was coined to deal with the
        | tricky Greek word ousia ("being") that Aristotle uses in his
       | discussions of change.
        
         | griffzhowl wrote:
         | One way I've seen it presented is that the early Greek
         | philosophers were grappling with how to reconcile two basic
         | facts: somethings stay the same (constancy or regularity), and
         | some things change.
         | 
         | Heraclitus was before Parmenides and said that everything
         | changes. Parmenides said that nothing changes, and then the
         | atomists, most prominently Democritus, synthesised these two
         | points of view by saying that there are atoms which don't
         | change, but all apparent change is explained by the relative
         | motions of the different basic atoms. Plato was influenced by
          | all of these. But I would say the theory of forms accounts
          | more for constancy or regularity than for change, no?
         | 
         | Btw, the central concept of Parmenides' philosophy is always
         | translated as "Being", but I couldn't find the original Greek
          | word. Isn't it "ousia"?
        
           | rck wrote:
           | I'm not sure what motivated Parmenides because he was more of
           | a poet than anything - it just happened that his poetry was
           | what we would now recognize as incredibly philosophical. He
           | didn't really argue, he just wrote down what the "goddess"
           | told him. But I think the basic problem is that everyone back
           | then agreed that you can't get "something from nothing," and
           | it sure seems like change requires being to come from non-
           | being. The statue is there now, but before it was cast there
           | wasn't a statue, just a chunk of bronze. If being can't come
           | from non-being, how do you account for the "coming-to-be" of
            | the statue? The Eleatic position as I understand it is that
           | the change is just an illusion. Plato and Aristotle both
           | react against this position and argue that it's silly (I'm
           | very inclined to agree). They then give alternative accounts
           | of what change really is.
           | 
           | I'm not sure about Plato, but the Aristotelian analysis is
           | something like this: every thing that exists has the
           | potential to exist in certain ways and not others, and it's
           | said that the thing is "in potency" to exist in those
           | potential ways. When something could exist in a certain way
           | but right now doesn't, that's called a "privation." And the
           | ways that the thing currently does exist are the "form" of
           | the thing. So a substance changes when it goes from being in
           | potency to being actual, and it does that by losing a
           | privation. Aquinas follows Aristotle in giving the example:
           | "For example, when a statue is made from bronze, the bronze
           | which is in potency to the form of the statue is the matter;
           | the shapeless or undisposed something is the privation; and
           | the shape because of which it is called a statue is the
           | form." Incidentally, Aquinas's short On the Principles of
           | Nature (https://aquinas.cc/la/en/~DePrinNat) is a good
           | overview of this theory, which is spread all over Aristotle
           | (in the Categories, the Physics, and the Metaphysics).
           | 
           | As far as ousia is concerned, I think this is the complete
           | Greek for Parmenides's poem:
           | http://philoctetes.free.fr/parmenidesunicode.htm. In the
           | places where that translation uses "being" you get slightly
           | different words like genesthai (to come into a new state of
           | being) or einai (just the infinitive "to be"). And looking at
           | the definition of ousia
           | (https://lsj.gr/wiki/%CE%BF%E1%BD%90%CF%83%CE%AF%CE%B1) it
           | looks like most of the uses of that term specifically come
           | well after Parmenides.
        
       | w10-1 wrote:
       | I like it when teachers (e.g., Grant Sanderson) are careful to
       | explain when they are trying to convey an intuition to motivate
       | and guide some complex math, because it orients you without
       | tangling you in all the misunderstanding that would come from
       | extending analogies or cross-cultural/discipline comparisons too
       | far.
       | 
       | But when authors start slinging around Plato and Aristotle and
       | especially Parmenides willy-nilly alongside modern principles,
       | they're waving a red flag... Don't get me started!
        
       | mallowdram wrote:
       | Isomorphism invariance applies to neural assemblies or syntax,
       | not to mere symbols. The problem in math is it models. Brains do
       | not model. Heraclitus was right if math never enters the picture
       | to add its arbitrariness. "A man in the night kindles a light for
       | himself when his sight is extinguished; living he is in contact
       | with the dead when asleep, when awake he is in touch with the
       | sleeper."
        
       | intalentive wrote:
       | Interesting aside about the Vienna circle and isomorphism. I
       | suspect that's where Hayek got his idea that mind and
       | representation are isomorphic, echoing Aristotle's assertion in
       | "On the Soul" / De Anima that the mind becomes the object of
       | perception.
        
       | measurablefunc wrote:
        | The natural transformation α : F ⇒ G is not specified properly
        | b/c when expressed in compositional form you also have to specify
        | the subscript for the natural transformation & it is an equality
        | instead of an isomorphism, i.e. if f : a → e then α_e ∘ Ff =
        | Gf ∘ α_a. There are higher categories where what he has
        | written down can make sense but in the context of the current
        | exposition it is not correct as written.
        
       | ibobev wrote:
        | The author is Jencel P.? I saved this book some time ago under
        | the author name Boris Marinov. Is this the same person now
        | writing under a different pen name?
        
         | larodi wrote:
          | It is the same person. He seems to obscure himself deliberately.
        
       | VirusNewbie wrote:
       | The author uses adjoint functors to explain equivalence and
       | naturality but doesn't actually call it that?
        
       ___________________________________________________________________
       (page generated 2025-10-01 23:01 UTC)