[HN Gopher] A Review of the Art of the Metaobject Protocol (2010...
___________________________________________________________________
A Review of the Art of the Metaobject Protocol (2010) [pdf]
Author : Tomte
Score : 80 points
Date : 2022-08-15 11:46 UTC (11 hours ago)
(HTM) web link (www.dreamsongs.com)
(TXT) w3m dump (www.dreamsongs.com)
| [deleted]
| iainctduncan wrote:
| Does anyone know if this review is published somewhere such
| that one can cite it properly? (i.e. where it came from?)
| The PDF has author information but no publishing
| information on the front page. Thanks!
| Jtsummers wrote:
| List of his publications, including this one:
|
| https://www.dreamsongs.com/Essays.html
|
| He states that he wrote it for the journal _Artificial
| Intelligence_ in 1993. That led me to Elsevier's website for
| the journal, which absolutely blows. Clicking around, it was
| this issue: https://www.sciencedirect.com/journal/artificial-
| intelligenc...
| iainctduncan wrote:
| thanks!
| eadmund wrote:
| There's some real gold in here:
|
| > Though it seems funny now to think that Scheme lexical
| environments were ever difficult to understand, you must take my
| word that in 1976, people with PhDs in computer science required
| careful explanation with lots of examples to get it. In fact,
| many people -- myself included -- had to write lexical
| interpreters to understand the interactions that led to the
| external behavior, and understanding the interactions was
| required to get it.
|
| Nowadays lexical scope is the default, and dynamic scope is hard
| to explain to folks who don't understand it yet. Makes you wonder
| how many hard problems would be easy with a different frame of
| reference.
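For readers who haven't met the distinction, here is a minimal Python sketch contrasting the two rules, with dynamic scope simulated by an explicit binding stack (all names here are invented for illustration):

```python
# Lexical scope: a closure resolves names in the environment where it
# was *defined*.
def make_adder(n):
    def add(x):
        return x + n          # 'n' comes from the defining environment
    return add

add5 = make_adder(5)
assert add5(1) == 6

# Dynamic scope, simulated: lookups walk a *call-time* binding stack.
bindings = [{"n": 100}]

def dyn_lookup(name):
    for frame in reversed(bindings):
        if name in frame:
            return frame[name]
    raise NameError(name)

def dyn_add(x):
    return x + dyn_lookup("n")    # 'n' is whoever bound it most recently

assert dyn_add(1) == 101
bindings.append({"n": 7})         # rebind for a dynamic extent
assert dyn_add(1) == 8
bindings.pop()                    # extent ends; outer binding is visible
assert dyn_add(1) == 101
```

The point of the simulation: dyn_add's 'n' is resolved against whoever most recently bound it on the call path, not against the environment where dyn_add was written.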
|
| > Nevertheless, in 1980--to pick a round date--the AI folks and
| programming languages folks parted ways. And with their departure
| came programming language winter.
|
| Although I imagine that a lot of folks would strongly disagree,
| there is a very real sense in which programming languages haven't
| really progressed since the early 80s.
|
| > These interpreters became Scheme, and from them they learned
| that function calling and message-passing were fundamentally the
| same, a lesson still not well and thoroughly learned today.
|
| Surprisingly, this still seems true a dozen years later.
| bmitc wrote:
| Is dynamic scope hard to understand or explain, or is it
| just the case that most find it to be a terrible idea for
| general-purpose software development?
| slaymaker1907 wrote:
| It is definitely useful for semi-global options, like a
| variable for the current stdout.
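Python's contextvars module gives exactly this kind of semi-global, dynamically scoped option; a small sketch in the spirit of Common Lisp's *standard-output* (the variable names here are invented):

```python
import contextvars
import io

# A dynamically scoped "current output" option.
current_out = contextvars.ContextVar("current_out")

def log(msg):
    # Resolved at call time against the innermost active binding.
    current_out.get().write(msg + "\n")

buf = io.StringIO()
token = current_out.set(buf)     # rebind for this dynamic extent
try:
    log("hello")
finally:
    current_out.reset(token)     # unwind to the outer binding

assert buf.getvalue() == "hello\n"
```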
| aardvark179 wrote:
| It does seem hard to explain to people, and although I agree
| it is not normally the right answer, it is often a better
| answer than simple thread local variables.
| eadmund wrote:
| > Is dynamic scope hard to understand or explain, or is it
| just the case that most find it to be a terrible idea for
| general-purpose software development?
|
| I think that dynamic scope is strictly better than global
| variables: all of its downsides are shared by globals, _but_
| it has the important advantage that a dynamic binding is
| scoped. Given that every popular language I am aware of
| supports globals in some form, I think that adopting dynamic
| scope would be an improvement.
|
| Even more interesting is to consider what it would mean to
| replace global scope with dynamic scoping entirely. I don't
| really know if that is possible or efficient, but it
| certainly is intriguing. Almost like dependency injection
| throughout the language. Kinda sorta kinda.
| gpderetta wrote:
| Implicit parameters seem to be the principled form of
| dynamic scoping for statically typed languages.
|
| Implicit parameters are on the list of must-have features
| for the programming language I'll never get around to
| implementing.
| samatman wrote:
| This is effectively how globals are handled in Lua: they
| are first-class environments, ordinary tables which can be
| assigned to a function (5.1/LuaJIT) or upvalue (5.2+) to
| resolve globals.
|
| I like it a lot; it's a supplement to lexical scope: like
| ordinary globals it's a flat namespace 'behind' the locals,
| but unlike many implementations of globals these
| environments are replaceable and (mostly) fungible.
| skybrian wrote:
| I think Go's context objects serve a similar purpose to
| dynamic scope?
|
| https://pkg.go.dev/context
|
| It seems like pretty much anything that lets you pass in an
| environment as a function parameter would work.
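The "pass an environment as a parameter" approach can be sketched in a few lines of Python (a loose illustration of the pattern, not Go's actual context API; fetch and the ctx keys are invented names):

```python
# Callees read their "ambient" settings from ctx instead of globals.
def fetch(ctx, url):
    timeout = ctx.get("timeout", 30)
    return f"GET {url} (timeout={timeout}s, user={ctx['user']})"

ctx = {"user": "alice", "timeout": 5}
assert fetch(ctx, "/data") == "GET /data (timeout=5s, user=alice)"
```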
| agumonkey wrote:
| I believe dynamic variables are reinvented every year.
| Even dependency injection seems like a sibling.
| CoastalCoder wrote:
| I'm surprised that dynamic vs. lexical scope is hard for
| seasoned programmers / computer-scientists to grasp.
|
| Maybe it's because the two concepts are inter-related during
| program execution?
|
| Similar to the way that latency vs. bandwidth can confuse
| some people, because they're inter-related in some
| circumstances but not others?
| Banana699 wrote:
| >there is a very real sense in which programming languages
| haven't really progressed since the early 80s.
|
| Can this be made more precise for somebody who doesn't
| share the feeling?
|
| As a programming language aficionado, my attitude is the exact
| opposite of this. 1980 is the debut of Smalltalk.
| Standardisation of Common Lisp happened sometime in the
| mid 1980s. The 1990s is the "Interpreters Renaissance", when Python
| and Ruby (and other less elegant siblings) came out, as well as
| the JVM and the proto-beginnings of the CLR. The 1980s and
| 1990s was an era of massive interest in Partial Evaluation, the
| revolutionary idea with consequences such as deriving a
| compiler out of an interpreter automatically. Those ideas were
| later revived and paid off in projects like PyPy and
| GraalVM+Truffle, late 2000s and early 2010s stuff. Haskell?
| 1990, and only popular beginning in the late 2000s/early 2010s.
| LLVM? early 2000s. SML? OCaml? F#? Scala? Erlang? Rust?
| Clojure? Perl and its legacy in Raku? Julia? Mathematica? I can
| go on and on.
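The "compiler out of an interpreter" idea mentioned above rests on partial evaluation: specializing a program on known inputs to produce a faster residual program. A toy Python illustration in the spirit of the first Futamura projection (all names here are invented), specializing power by generating code:

```python
def power(base, n):
    result = 1
    for _ in range(n):
        result *= base
    return result

def specialize_power(n):
    # The loop over the known input n disappears; what remains is a
    # straight-line chain of multiplications -- the residual program.
    body = " * ".join(["base"] * n) if n else "1"
    src = f"def specialized(base):\n    return {body}\n"
    namespace = {}
    exec(src, namespace)
    return namespace["specialized"]

cube = specialize_power(3)          # residual: return base * base * base
assert cube(2) == 8 == power(2, 3)
```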
|
| Highly skeptical of that "very real sense" which views all
| of this Cambrian explosion of ideas and explorations as not
| really progress.
| time_to_smile wrote:
| The differences between Fortran, Lisp, C, Prolog, Smalltalk,
| APL and ML are wildly more extreme than those between
| Python, Java, C++, JavaScript and Rust. Languages like
| Haskell and OCaml are
| natural evolutions of ML work done in the 70s, just like
| Clojure is for Lisp.
|
| That list of languages that existed prior to 1980s is _still_
| considered a must learn set of languages for any serious
| programming aficionado. All of the more recent languages tend
| to exist in the "learn one and learn it well" category
| provided you already have experience in the pre-1980s
| languages. The major exception to this would be Haskell, but
| that's more so because it has basically taken the throne as
| the modern heir to the ML kingdom.
|
| That's not to say there hasn't been some really interesting
| work in programming languages since the 1980s, but it's very
| similar to OS world (we never really moved past the age of
| UNIX). We've largely established the major paradigms of
| programming and refined them rather than really breaking new
| ground.
| eadmund wrote:
| > > there is a very real sense in which programming languages
| haven't really progressed since the early 80s.
|
| > 1980 is the debut of Smalltalk.
|
| 1972, actually!
|
| > Standardisation of Common Lisp happened sometime in the
| mid 1980s.
|
| Finished in the mid 90s. But Lisp was around well before the
| 80s, and CL was in a lot of ways a standardisation of the
| practices developed in the 70s & 80s.
|
| > 80s. The 1990s is the "Interpreters Renaissance", when
| Python and Ruby (and other less elegant siblings) came out
|
| I don't really count that as progress. Python is just an
| unacceptable Lisp (1958), and Ruby is by analogy an
| unacceptable Smalltalk (1972).
|
| > ... the JVM and the proto-beginnings of the CLR.
|
| The JVM is just refinement of ideas dating back to UCSD
| Pascal (1977), and the CLR is just an embraced and extended
| JVM.
|
| > The 1980s and 1990s was an era of massive interest in
| Partial Evaluation, the revolutionary idea with consequences
| such as deriving a compiler out of an interpreter
| automatically.
|
| Emanuelson & Haraldsson wrote about partial evaluation in
| Lisp in their 1980 paper _On compiling embedded languages in
| LISP_.
|
| > Haskell? 1990, and only popular beginning in the late
| 2000s/early 2010s. LLVM? early 2000s. SML? OCaml? F#? Scala?
| Erlang? Rust? Clojure? Perl and its legacy in Raku? Julia?
| Mathematica?
|
| Haskell was strongly influenced by Miranda (1985), and of
| course based on Hindley-Milner (1969/1978), albeit with
| improvements. I'll grant you that Wadler & Blott's typeclass
| proposal dates to the late 80s. What about LLVM is fundamental
| progress rather than refinement of existing work? Standard ML
| dates back to the early 80s: 1983. ML itself dates to the
| early 70s. OCaml is OO ML, a marriage of a language from the
| early 80s and a concept from the early 60s (Simula dates back
| to 1962). F# is another member of the ML (1973) family.
| 'Something old, but this time on a new VM!' isn't really
| progress, is it? Scala is OO (60s) plus functional
| programming (50s). Erlang appeared in 1986, so I will grant
| that it's just outside of my 'early 80s' comment. But did it
| progress as a _language_ or as a _runtime_? Granted the
| boundaries are fuzzy. I'm not familiar with the scholarship
| which influenced Rust -- it may genuinely have a few ideas
| which don't date back to the late 70s/early 80s. I imagine
| someone here can educate me! Clojure is 'not-really-Lisp, but
| on the JVM'; is that really progress as opposed to
| refinement? Not that refinement is unimportant! Perl's origin
| in 1988 _is_ past the early 80s, but ... is there really much
| original there? Especially in its early versions it was
| basically a better awk (1977). I am not familiar with Raku,
| but it doesn't really look original, but more like a
| polishing of Perl. Julia's big thing is multiple dispatch,
| right? That dates back to the early 70s, so it can't really
| be counted as an innovation in a language from a decade ago.
| Finally, Mathematica dates to the late 80s, so definitely
| outside the window I gave, but isn't there a substantial
| overlap between its functionality on one hand and Lisp
| /Macsyma on the other?
|
| Edit: reformatted to not make this a huge string of
| paragraphs. Instead, it's a wall of text. Dunno if that is an
| improvement. Do let me know ...
| abecedarius wrote:
| > Mathematica dates to the late 80s
|
| It's pretty similar to Wolfram's early-80s math language
| which he abandoned after a rights dispute with Caltech,
| where he'd been employed at the time.
| lizmat wrote:
| > ... it doesn't really look original, but more like a
| polishing of Perl.
|
| Sure, Raku has its roots in Perl. But if multiple dispatch
| is the big thing for Julia, why wouldn't it be for Raku?
|
| Other things Raku has that Perl doesn't have:
|
| - grammars built in; in fact Raku uses a Raku grammar to
| parse itself
| - a sane and genuine async / multi-thread / event model
| - Rational arithmetic, big integers and complex numbers by
| default
| - lazy lists: say (1..Inf)[42] won't hang
| - Junctions: a == 42 | 666 | 314
| - Gradual typing and coercions
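The lazy-list behavior mentioned has a rough Python analog using generators (illustrative only; this is not Raku semantics):

```python
import itertools

# Raku's (1..Inf)[42]: an infinite sequence that is only ever
# consumed lazily, so indexing into it terminates.
naturals = itertools.count(1)    # 1, 2, 3, ... never materialized
item_42 = next(itertools.islice(naturals, 42, None))
assert item_42 == 43             # 0-based index 42 of 1, 2, 3, ...
```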
|
| > I am not familiar with Raku, but
|
| Perhaps it's time to familiarize yourself a bit more with
| Raku :-)
| mek6800d2 wrote:
| I think you misunderstood eadmund's point about Julia and
| multiple dispatch. :( Still, your succinct summary of some
| of Raku's capabilities makes me, at least, want to study
| it in more detail -- so, thank you!
| infinite8s wrote:
| The Smalltalk of 1972 is not the Smalltalk of 1980.
| abecedarius wrote:
| Right; though Smalltalk-80 is very like Smalltalk-76.
| KerrAvon wrote:
| I think the point in this context is to the extent most
| might be familiar with Smalltalk, the familiarity is with
| Smalltalk-80 and descendants.
| ogogmad wrote:
| I have only a passing interest, but Rust is based on ideas
| from the late '80s concerning linear logic and affine logic
| and their relationship to PLT. Monadic programming and its
| application to, for instance, parser combinators is from
| the early '90s.
| Banana699 wrote:
| As with all arguments about "not really innovation", the
| problem seems to be in drawing boundaries around ideas. If
| people in the 1980s and the 1990s were all really
| reinventing Kay and McCarthy, then why weren't Kay and
| McCarthy reinventing Turing and Church? Why weren't Turing
| and Church reinventing Russell and Frege? Are we going,
| like the philosopher who called the entirety of philosophy
| and its history "a series of footnotes to Plato", to call
| the entirety of CS and its history a series of footnotes
| to Leibniz?
|
| In some sense of the words, all of this is true. The human
| brain is just a finite configuration of 100 billion objects
| after all, and as you zoom out to more meaningful levels of
| abstraction the set of distinguishable states only gets
| smaller. Shoulders of giants, etc. So, of course, whatever
| you have thought of is not original if you zoom out far
| enough. My only question: why stop zooming out at the
| arbitrary cutoff of the early 1980s? Lisp dates from 1958,
| Algol from 1960, Simula from 1962, APL from 1964; those
| four together are essentially all of programming language
| design summarized into four specimens, in the same way
| that [bird, fish, ape, virus] are essentially all of life
| summarized into four specimens, don't you agree? So
| 'essentially' all programming languages were invented
| before 1965, even more essentially before 1970. What makes
| this 'essentially' more false than yours, which just moves
| the cutoff 20 years later? Types are just fancy sets,
| right? Sets are ancient, so all of type theory is
| 'essentially' reinventing Russell and Whitehead.
|
| In essence, Computer Science was invented before it was
| born, and that's literally true in a very peculiar sense:
| CS is an outgrowth of a delightfully bizarre intersection
| of Math, Logic, Physics, Information Theory, etc., that
| was always there as long as those fields existed under one
| name or another; it was just recognized as special in the
| 1940s-1960s and given a name.
|
| So, before I push back on your particular claims, I will
| grant you that there is indeed a very particular sense in
| which you're completely correct, but I don't find that
| sense convincing enough. You can always find the very
| special and peculiar way/interpretation/precondition in
| which any statement can be true if you're patient enough,
| there is a glimmer of truth in every statement no matter
| how bizarre. I don't mean to say your claim is bizarre or
| obviously false, it's not, but it's not obviously plausible
| or even a useful way to view things either in my view.
|
| --------
|
| >> 1980 is the debut of Smalltalk.
|
| >1972, actually!
|
| I should have clarified that by 'debut' I mean the general
| release of the Smalltalk VM to a non-Xerox PARC audience.
| This is an important milestone because what followed was
| two decades (Smalltalk fans would say four decades and
| counting :)) of research and enhancement of all ideas
| surrounding the language and the paradigm. One particularly
| beautiful direction is Self, which invented prototype-based
| OO and was the first to articulate and implement the idea
| of an "extremely smart runtime-adaptive compiler" publicly,
| as far as I know. Isn't this another hole in your claim?
|
| >Finished in the mid 90s. But Lisp was around well before
| the 80s, and CL was in a lot of ways a standardisation of
| the practices developed in the 70s & 80s.
|
| What I'm saying is: why bother stopping at the 1970s, when
| you can just as easily say that Lisp was largely finished
| by 1965? You wouldn't be qualitatively more wrong, only
| incrementally so.
|
| >Python is just an unacceptable Lisp (1958), and Ruby is by
| analogy an unacceptable Smalltalk (1972).
|
| Lisp is just an unacceptable Lambda Calculus (1935).
|
| >The JVM is just refinement of ideas dating back to UCSD
| Pascal (1977), and the CLR is just an embraced and extended
| JVM.
|
| I will just play my same old unimaginative game and ask
| you: why stop at UCSD Pascal? Search for "virtual machines"
| in the ACM repository and you will find hits from the early
| 1960s. Bytecode is an idea that falls extremely naturally
| out of the idea of pseudo code, which assembly programmers
| had been using extensively since the early 1950s; indeed,
| the first proto-languages (pre-Fortran!) were just bytecode
| interpreters (except the bytecode was written directly by
| the programmers, and that was an improvement over hardware
| assembly). Fortran compilers for the IBM 1401, to give a
| specific and far-from-the-only example, consisted of about
| 60 or so passes that began with human-readable text and
| ended with machine-executable bytes; every IR in between is
| just begging you to discover the idea of a formalized
| bytecode format and, from there, the Entire Thing of VMs.
| The Apollo project's software engineers (1969 launch,
| software developed from 1965 to 1969) explicitly embedded a
| VM for vector math inside the code, and wrote a big part of
| the vector-heavy navigation code in its bytecode. An
| honest-to-God VM that interpreted its bytecode at runtime.
| People Knew The Stuff.
|
| What did Pascal VMs do? They were the first to, publicly
| and popularly and with extensible implementations,
| evangelize and elaborate on ideas known for 20+ years.
| In other words, Not Really Progress(TM).
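The core of the bytecode-VM idea the comment describes really is small; a minimal stack-based sketch in Python (the instruction set here is invented for this sketch):

```python
# A minimal stack-based bytecode VM.
def run(program):
    stack = []
    for op, arg in program:
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack[-1]

# Bytecode for (2 + 3) * 4
prog = [("PUSH", 2), ("PUSH", 3), ("ADD", None), ("PUSH", 4), ("MUL", None)]
assert run(prog) == 20
```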
|
| >partial evaluation in Lisp in their 1980 paper On
| compiling embedded languages in LISP
|
| And the entire idea dates back to the Futamura projections
| of the 1970s, but that's not the point; the point is the
| vast amount of learning, research, and development that has
| happened since then. Like I said above, if you're only
| concerned with the raw beginnings of an idea, then types
| are just a trivial footnote to sets and (more generally)
| category theory, both extensively elaborated on from the
| 1900s to the 1950s. Sets, in particular, are "just" a kinky
| way of denoting natural numbers, sooo... do you see where
| this one is going?
|
| >Haskell was strongly influenced by Miranda
|
| I think you're familiar enough with my "But Why Stop
| There?" game that you can play it yourself here, but I will
| just note that Haskell's incorporation of Lazy Eval into
| the entirety of the language was something unprecedented as
| far as I know, so there's that: one ground-shaking idea and
| all of its myriad consequences on the language and its
| design.
|
| >F# is another member of the ML (1973) family. 'Something
| old, but this time on a new VM!' isn't really progress, is
| it?
|
| >Scala is OO (60s) plus functional programming (50s)
|
| ><other languages>
|
| All of those have a ton of new things to offer. As
| examples, F# is far from a generic ML clone; just look at
| active patterns and type providers. Scala gives people who
| venture too deep inside it PTSD, although I haven't
| experienced it myself. Julia and Mathematica have macros
| without being as braindead as Lisp on syntax, which is a
| massive improvement and refutes a point that Lisp fans love
| repeating to this day. They aren't the only ones; Elixir
| (which wouldn't have been possible without Erlang) is
| another, among many others. Perl and Mathematica's legacy
| is extreme syntactic flexibility; Mathematica goes as far
| as making diagrams and images first-class syntax, all the
| while representing syntax as nested lists underneath. Raku
| is the first general-purpose language to give BNF grammars
| full first-class citizen status.
|
| Like I said, it really seems to me you're doing "All of
| life is just birds, viruses, fishes and mammals, what's new
| here? Biology is done".
| mncharity wrote:
| Triumphal progress! Tragic stasis. Perhaps an issue of
| interpretive context?
|
| Say you're handed a glass. Do you call it half full or half
| empty? :) What if the level is because the hander (history?
| society? our profession?) always spills it? And keeps
| forgetting how to work the cooler? And the sink is turned off
| because they're too cheap or can't be bothered to call a
| plumber? There's water in hand: joy! And sadness.
|
| If someone jumps in a hole, do we count clawing their head
| back above ground as progress? If wheelbarrows are known, is
| a towering hand-carried mound inspirational or depressing? If
| an entire well-coordinated workforce has dysentery, and
| management has for years been refusing requests to fund more
| research on latrine placement, is it well managed?
|
| Hole... A room of Java Swing devs stands in awed ovation
| for a Rails demo - yay progress! And Java N+1 adds a long-looked-
| forward-to feature X! And yet... Java was originally,
| intentionally, feature stripped to cripple and down-skill
| programming for industry. "Java N+1 - Now with less cripple!"
| Jumping in the hole was even arguably the right call. And at
| least java meant my never having to do the "gc can be
| helpful" conversation ever again, for which I remain
| grateful. So yay progress, but of a wistful sort.
|
| Aside... So much of these last decades seems a massive
| exercise in programmer education. Boston Python meetups,
| with pre-COVID attendance in the hundreds, were once a
| half-filled little conference room. I once saw a project
| fail in part because highly paid
| and professional devs had trouble with the concept of hash
| table. Our vastly expanded community and resources and
| competence are greatly empowering. But so much of that
| education seems... _remedial_? And the pace of progress
| seemed to stall as focus and expertise was dispersed. So
| awesome progress! So little progress.
|
| Wheelbarrows... A recent HN comment asked which language is
| best to teach programming with. And I thought, maybe use
| multiple languages? Maybe for each concept, first show it in
| a language which handles it less badly, before getting bogged
| down in common cruftiness? A concatenative language for
| Tufte-esque no wasted ink. APL/J for eyeball abstraction.
| PureScript or an alternate prelude rather than Haskell's
| historical cruft. On and on. And yet... how demotivating
| might students find that? To again and again see beauty and
| power, and then be told "and... you won't be allowed to live
| most of this... for years and decades, and much never before
| you retire". When each new thing is still so suboptimal in so
| many ways, it gets harder to see it as progress.
|
| Hole... Picture an industry researcher giving a talk,
| describing the creation of a massive tool to analyze and find
| bugs in large C++ programs. And in Q&A is asked, "So, by dint
| of this extraordinary effort, you've managed to almost but
| not quite make up for C++ having such a poor type system?"
| And with a wistful smile, nodding yes. Progress? Or symptom
| of lack of progress?
|
| Dysentery... A researcher with fun ideas on garbage
| collection burns years trying to frame it as security benefit
| in forlorn hope of DHS funding, having found none elsewhere.
| A researcher mentions having found a PhD student to explore a
| foundational functional programming idea, unstalling it after
| more than a decade of limited attention. A single person is
| the clear expert on X and keeps meaning to write it up, so
| let's fund that to avoid the waste of years of slower
| dispersion... woops, nope, politicians slashed our budget.
| The industrial policy of the United States is that critical
| bottlenecks in software progress, foundational for the
| economy and societal progress, are best pursued by
| individuals doing hobby projects in their spare time, and by
| research slow-mo stretched across decades. Progress, and
| its absence, is judged in part against what might have
| been, what could have been.
|
| So the progress of these last decades has been gloriously
| breathtaking. And it's a pity that we as a profession have
| been so incompetently dysfunctional that the needle has
| barely moved in two human generations.
| rileyphone wrote:
| > The industrial policy of the United States is that
| critical bottlenecks in software progress, foundational for
| the economy and societal progress, are best pursued by
| individuals doing hobby projects in their spare time, and
| by research slow-mo stretched across decades.
|
| Meanwhile VC pours funds into 'no-code' tools that
| represent unprincipled grasping in this critical problem
| space. Of course at some level of software sophistication
| you are going to need 'code', that is a medium of
| programming, and there isn't any real reason that a
| programming language in the tradition of Lisp or Smalltalk
| could provide those same end-user facilities, just stymied
| progress from lack of DARPA level follow ups and rapidly
| shifting business models. I blame, in equal parts, Java,
| the Web, and Ronald Reagan for where we are today, though
| that is no excuse for us hobbyists to keep trying to solve
| critical software bottlenecks. Who knows, maybe the next
| Big Thing in programming languages won't be yet another
| weekend project gone supercritical with a cranky BDFL, last
| second design, and toxic community - maybe it will get out
| of its own way and help people build the software that they
| need.
| rjsw wrote:
| Looks more like continually reinventing the wheel than a
| Cambrian explosion of ideas to me.
| abecedarius wrote:
| ML existed in the early 80s. So did early lazy functional
| languages (KRC). I could go on in this vein if the "very real
| sense" he means is about finding basic ideas vs. developing
| and spreading them. (I get the impression Rust passes this
| test, though. E.g. Wadler's "Linear types can change the
| world" is from 1990, Baker's "Lively linear Lisp" from 1992,
| and after that I stopped looking.)
|
| I haven't read the OP yet.
| Upvoter33 wrote:
| It is interesting how PL used to be central to AI (I remember the
| strong association of AI people and Lisp from my era). Of course,
| neural networks and deep learning changed the focal points of the
| area - in a good way - which makes a lot of the old work seem
| interesting perhaps only historically. Times do change, I
| suppose...
| bodhiandphysics wrote:
| Language design used to be central to everyone! It was one of
| the major core skills of programming computers. Consider for
| instance how when Bell Labs had a difficult programming job to
| do (like writing an operating system!), they tended to eventually
| create (multiple) languages.
| runevault wrote:
| When you have to express things in code, what the code lets
| you express, and at what level of difficulty, matters. When
| things like models come into the picture and change how you
| can express the problem and/or solution, that changes what
| matters in a given scenario.
| PaulHoule wrote:
| I found that book a little disappointing. Some other books focus
| on macros and metaprogramming techniques that are unique to LISP
| (say Graham's "On LISP"), but despite all the differences between
| CLOS and Python all I could see was "look, I can draw a line
| between this function in the metaobject protocol and that
| function in the Python implementation of objects and get
| something very close to an isomorphism".
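For what it's worth, the kind of correspondence described can be made concrete: a Python metaclass intercepting instantiation is loosely the analog of specializing CLOS's make-instance on a custom metaclass. A rough sketch (the class names here are invented):

```python
# A metaclass that counts instantiations.
class Counting(type):
    def __call__(cls, *args, **kwargs):
        # Runs on every instantiation, before __init__ -- the hook
        # where CLOS would let you customize make-instance.
        cls.instances = getattr(cls, "instances", 0) + 1
        return super().__call__(*args, **kwargs)

class Point(metaclass=Counting):
    def __init__(self, x, y):
        self.x, self.y = x, y

p, q = Point(1, 2), Point(3, 4)
assert Point.instances == 2
assert (p.x, q.y) == (1, 4)
```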
|
| "Meta objects" are significant in many areas of programming where
| you embed objects from system A inside system B (as opposed to
| translate native A objects into native B objects). This ranges
| from "script a Java object in Python" to systems like COM and
| CORBA as well as UML, MOF, object-relational and object-MOF
| mappers, etc. Alternative "meta object" systems can expand the
| capabilities of a single object system.
|
| I'm not sure if anything really good has been written about
| that topic. I think people aren't so interested in "objects
| in general" today, but on the other hand we have had 30+
| years of experience with real-life object systems that can
| be consolidated.
| EdwardCoffin wrote:
| I think the problem with explaining the MOP is that it really
| only pays off with systems that are big enough and complicated
| enough that you really can't use them as examples in a book.
| You can provide contrived examples but then some readers just
| nitpick on how contrived they are.
| gjvc wrote:
| Excellent observation. I would like to see an edition of this
| book addressing a real project.
|
| Note that the only way to really learn a language is "have a
| project to implement in that language", not just read
| examples.
| EdwardCoffin wrote:
| I feel compelled to note that I was cribbing my observation
| from a similar one made by Guy Steele, who ran into similar
| trouble finding a non-trivial example to justify the
| necessity of tail calls in an object oriented language [1]:
|
| > (someone's criticism): The objection to TCO that you're
| trying to answer is that it's rarely helpful in the real
| world. So you have to find real-world examples to refute
| that. An impractically-slow set isn't a real-world example.
|
| > (GLS response): I wish I could fit a real-world example
| onto a blog page. The best I can do is to exhibit a "toy"
| example that illustrates the basic idea.
|
| [1] https://web.archive.org/web/20110716163344/http://proje
| ctfor...
| lispm wrote:
| > look, I can draw a line between this function in the
| metaobject protocol and that function in the Python
| implementation of objects and get something very close to an
| isomorphism
|
| The book (1991) and the CLOS MOP (1987/88) itself are now quite
| old and other languages had (Smalltalk) and have (Python)
| something similar.
|
| https://ml.cddddr.org/cl-object-oriented-programming/mop/198...
|
| One thing, not related to its main content, I always liked
| about the AMOP is the generally good style of Lisp programming
| in the book.
| JHonaker wrote:
| I mean, the systems described in the TAotMOP, and CLOS, are
| early implementations of multiple dispatch, which is the core
| paradigm of the Julia language.
|
| Julia doesn't go quite as far as implementing a MOP. It doesn't
| allow for the arbitrary modification of the semantics of
| evaluation in the language itself, but I'll bet it'd be pretty
| easy to write a MOP on top of the base language.
|
| I think the ideas in the MOP are exactly those that you've
| identified in the first sentence. You can draw lines that allow
| you to compare the object implementations of particular
| programming languages. This isn't so useful when you're just
| working in a single language, but in something like CL or
| Scheme (or Racket more specifically) where a major design
| strategy is to design a DSL that suits your problem, having a
| MOP really lets you tailor that DSL to the problem in ways you
| might have a hard time doing otherwise.
|
| For example, I once wrote a library for computing in a
| particular area of mathematics, but I also wanted to see
| some symbolic equations. I just changed the basic
| evaluation semantics of my numeric operators to operate on
| symbols as well as numbers and to run some simplification
| logic, and boom: I had a rough view of the equations that
| were actually being used in the computation, which I could
| use to debug or tidy up and write down.
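A toy Python version of the trick described: operators that accept symbols as well as numbers, with a little simplification logic baked in (all names invented; this is not the commenter's library):

```python
# Numeric operators overloaded to also work on symbols.
class Sym:
    def __init__(self, expr):
        self.expr = expr
    def __add__(self, other):
        if other == 0:                 # simplify: x + 0 => x
            return self
        return Sym(f"({self.expr} + {other})")
    def __mul__(self, other):
        if other == 1:                 # simplify: x * 1 => x
            return self
        if other == 0:                 # simplify: x * 0 => 0
            return 0
        return Sym(f"({self.expr} * {other})")
    def __repr__(self):
        return self.expr

x = Sym("x")
assert repr(x + 0) == "x"              # simplification kicked in
assert repr((x * 1) + 2) == "(x + 2)"  # mixed symbols and numbers
```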
___________________________________________________________________
(page generated 2022-08-15 23:01 UTC)