Post 9fdPdB77KA2ooDJpui by vertigo@mastodon.social
 (DIR) Post #9fc7ITR4164DDbfOGO by fribbledom@mastodon.social
       2019-02-08T02:57:33Z
       
       1 like, 0 repeats
       
       For nostalgia's sake I just loaded up GW-BASIC in DOSEMU. What a flashback to 1989: it was the second programming environment I'd ever used in my life, and - from today's perspective - it's quite possibly the worst introduction to programming one could imagine... and yet I absolutely loved everything about it! From then on, childlike curiosity took the helm. I was able to pour all my energy & creativity into little programs, and it was the most liberating thing I've experienced in my life.
       
 (DIR) Post #9fc7ITmgihMeIfmey8 by vertigo@mastodon.social
       2019-02-08T04:27:28Z
       
       0 likes, 0 repeats
       
       @fribbledom I disagree that BASIC (in any form) was the worst way to introduce programming.  An entire generation grew up on BASIC and moved successfully onto other, more powerful languages later on.  I count myself amongst them.
       
 (DIR) Post #9fc7IU4ldTpHCkF69I by kragen@nerdculture.de
       2019-02-08T04:29:24Z
       
       0 likes, 0 repeats
       
       @vertigo @fribbledom I feel like BASIC led me to emphasize rote memorization of patterns and APIs over learning to build composable abstractions
       
 (DIR) Post #9fc7ZaxhHrb9U45vvs by vertigo@mastodon.social
       2019-02-08T04:32:30Z
       
       0 likes, 0 repeats
       
       @kragen @fribbledom Sure; idioms are a part of all languages, though some more than others. Although BASIC from Kemeny and Kurtz had the facilities for that on mainframes by the time GW-BASIC was popular, they had not appeared on home computers in any capacity until Visual Basic.  And by then, you might as well have learned Pascal or C, which pretty much doomed BASIC either way as anything except a RAD tool or scripting language.
       
 (DIR) Post #9fc7i3Gbvw4xaSyRA8 by loke@functional.cafe
       2019-02-08T04:34:01Z
       
       0 likes, 0 repeats
       
       @kragen @vertigo @fribbledom It has a programming environment based on a REPL, which makes it very easy to experiment. That's the best way to become familiar with the concepts. There are of course better languages with a REPL, but most of them didn't run on the old 8-bit computers of the time.
       
 (DIR) Post #9fc8b4Ce9efGPwLw36 by vertigo@mastodon.social
       2019-02-08T04:35:12Z
       
       0 likes, 0 repeats
       
       @kragen @fribbledom That said, part of why I want Forth as the default language in the #Kestrel3 is because it is both more powerful and more expressive than BASIC, albeit at the expense of fewer built-in data types. I am also considering offering a Lisp environment for the computer, at least as an option which can be loaded from storage.
       
 (DIR) Post #9fc8b4iCGLtxzn78E4 by kragen@nerdculture.de
       2019-02-08T04:43:59Z
       
       0 likes, 0 repeats
       
       @vertigo @fribbledom A thing BASIC has over Forth is that it's a lot harder to crash the machine; array indexing is bounds-checked, etc.
       
 (DIR) Post #9fc8f9r94uO76oFLnc by vertigo@mastodon.social
       2019-02-08T04:44:44Z
       
       0 likes, 0 repeats
       
       @kragen @fribbledom and is garbage collected and statically type-checked.
       
 (DIR) Post #9fc8wqXDeZGcOgMQXg by kragen@nerdculture.de
       2019-02-08T04:47:56Z
       
       0 likes, 0 repeats
       
       @vertigo @fribbledom yeah, although not in a very powerful way
       
 (DIR) Post #9fc973jrPsUqFC8xd2 by vertigo@mastodon.social
       2019-02-08T04:49:46Z
       
       0 likes, 0 repeats
       
       @kragen @fribbledom It was sufficient, and it's a lot better than what Forth gives you (which is strictly on par with what C gives without the benefit of a standard library to help).
       
 (DIR) Post #9fc9TwgNohrN1KQiiO by kragen@nerdculture.de
       2019-02-08T04:53:54Z
       
       0 likes, 0 repeats
       
       @vertigo @fribbledom C has more static type checking than Forth, and you can't accidentally pass an argument to the wrong function in C because you were confused about arity. I wonder if you could usefully use FORGET/MARKER in Forth to build up a big data structure at the end of the dictionary and then reset it (as opposed to just for reloading code you changed).
       
 (DIR) Post #9fc9h7N9NqvFwGiOfI by vertigo@mastodon.social
       2019-02-08T04:56:17Z
       
       0 likes, 0 repeats
       
       @kragen @fribbledom Forth has *zero* type checking; everything in Forth is a machine word (in this sense, it's like BCPL). Passing wrong args to functions in C was actually a real thing back before ANSI came along; it took ANSI to clean that semantic up. Using FORGET?  Not so much; however, you can with MARKER.  Better still, use positive and negative offsets to ALLOT.  I used that extensively in my blog software when constructing strings and expanding templates.
       
 (DIR) Post #9fcCHQOuJWuLHsydma by vertigo@mastodon.social
       2019-02-08T04:57:10Z
       
       0 likes, 0 repeats
       
       @kragen @fribbledom (also, type-checking in C *requires* ANSI-style function declarations; if you continue to define a function in K&R style, type checking is not enforced.)
       
 (DIR) Post #9fcCHQgHGwno9l6VrE by kragen@nerdculture.de
       2019-02-08T05:25:14Z
       
       0 likes, 0 repeats
       
       @vertigo @fribbledom You still get *some* type-checking in C even without prototypes.
       
 (DIR) Post #9fcCVvE0VVMIwPn86a by kragen@nerdculture.de
       2019-02-08T05:27:52Z
       
       0 likes, 0 repeats
       
       @vertigo @fribbledom Hmm, I never thought about using ALLOT with a negative offset! By "passing args to the wrong function" I mean you say "FOO BAR BAZ QUUX WORF" and you hope it parses as WORF(BAZ(FOO, BAR), QUUX), but you forgot that QUUX takes an argument, and so what you actually have is something like WORF(37, QUUX(BAZ(FOO, BAR))). That can happen in PostScript or REBOL or Forth, but not in BCPL or C (even K&R C) or Lisp.
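The misparse kragen describes can be simulated with a tiny stack machine. A sketch in JavaScript (the word names FOO/BAR/BAZ/QUUX/WORF come from the toot; the arities and string representation are assumptions purely for illustration):

```javascript
// A Forth-style data stack: words take arguments by popping and
// leave results by pushing, so arity mistakes silently shift operands.
const stack = [];
const push = (x) => stack.push(x);

// BAZ and WORF each consume two items; QUUX consumes one --
// which is exactly what the programmer forgot.
const baz  = () => { const b = stack.pop(), a = stack.pop(); push(`BAZ(${a},${b})`); };
const quux = () => { const a = stack.pop(); push(`QUUX(${a})`); };
const worf = () => { const b = stack.pop(), a = stack.pop(); push(`WORF(${a},${b})`); };

push(37);                  // a leftover value from some earlier computation
push("FOO"); push("BAR");
baz(); quux(); worf();     // intended WORF(BAZ(FOO,BAR), QUUX), but QUUX eats BAZ's result
const result = stack.pop();
console.log(result);       // WORF(37,QUUX(BAZ(FOO,BAR)))
```

The unrelated 37 gets sucked into WORF's first slot, exactly as described in the later toots about stack imbalance.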
       
 (DIR) Post #9fcCz7wwHAlonujZQm by vertigo@mastodon.social
       2019-02-08T04:51:01Z
       
       0 likes, 0 repeats
       
       @kragen @fribbledom Also, remember that BASIC on 8-bit and 16-bit home computers was not the state of the art; Kemeny and Kurtz were advancing BASIC all along in the background, unnoticed by the majority.
       
 (DIR) Post #9fcCz869iu8vGV2vnU by vertigo@mastodon.social
       2019-02-08T04:53:57Z
       
       0 likes, 0 repeats
       
       @kragen @fribbledom Honestly, the single biggest feature I miss from BASIC was that statements had to fit on a single (virtual) line of text, which made integrating the command prompt ("READY") and the full-screen editor a breeze (as found on the Atari and Commodore 8-bits; I'm not sure if Acorn had something similar or not). My first exposure to BASIC as a proper, structured programming language (but still not state of the art) was with AmigaBASIC.
       
 (DIR) Post #9fcCz8ID05mfrsgYaG by kragen@nerdculture.de
       2019-02-08T05:33:09Z
       
       0 likes, 0 repeats
       
       @vertigo @fribbledom I wonder why LOGO didn't crush BASIC. It had full-screen editors, garbage collection, and subroutines with PARAMETERS
       
 (DIR) Post #9fcDmZFLZjfFeZv39U by vertigo@mastodon.social
       2019-02-08T04:37:03Z
       
       0 likes, 0 repeats
       
       @loke @kragen @fribbledom Also, a lot of the earlier 8-bit computers had superior editing facilities. With BASIC being line-oriented, it was a simple matter to make a full-screen editor.  The number one issue I have with Python or Lisp, for example, is that I just cannot cursor up to a line, fix it, then have the computer re-evaluate that expression for me. You can kinda sorta regain that experience in Emacs with C-x C-e, but you have to remember to invoke it.  It's still not fully automatic.
       
 (DIR) Post #9fcDmZQKusSGCf3pHU by kragen@nerdculture.de
       2019-02-08T05:42:06Z
       
       0 likes, 0 repeats
       
       @vertigo @loke @fribbledom To get that experience at that level practically, I think you need a side-effect-free language, like Excel formulas.
       
 (DIR) Post #9fcEJF2Hjt1uWlOE4W by vertigo@mastodon.social
       2019-02-08T05:47:58Z
       
       0 likes, 0 repeats
       
       @kragen @fribbledom It actually can happen in BCPL (and it happens in JavaScript too!).  It's part of Dr. Martin Richards' current definition of the BCPL language that all functions are variadic. If you call FOO(1,2,3,4), but your code has: LET FOO(a,b) BE ... then you can access a and b (1 and 2), but not 3 or 4.  Likewise, given LET FOO(a,b,c,d) BE ... and you invoke it as FOO(1,2), then c and d are undefined.
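The BCPL behaviour described above can be demonstrated directly in JavaScript, which shares it: extra arguments are silently dropped, and missing ones arrive as undefined (function names mirror the toot's FOO/BAR examples):

```javascript
// Called with more arguments than parameters: the extras are accepted
// but have no names (only reachable via the `arguments` object).
function foo(a, b) { return [a, b]; }
const extra = foo(1, 2, 3, 4);
console.log(extra);     // [ 1, 2 ]

// Called with fewer arguments than parameters: the rest are undefined.
function bar(a, b, c, d) { return [c, d]; }
const missing = bar(1, 2);
console.log(missing);   // [ undefined, undefined ]
```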
       
 (DIR) Post #9fcEUGPGz0FDxqOjE8 by vertigo@mastodon.social
       2019-02-08T05:49:59Z
       
       0 likes, 0 repeats
       
       @kragen @fribbledom and it was a proper Lisp, albeit with somewhat odd syntax. I'm convinced that whoever made computing history just wasn't interested in doing things right.  BASIC had the benefit of marketing behind it (it's baked right into the name, even).  That's all I got.  :(
       
 (DIR) Post #9fcEcVJbRAKuWEfupk by djsumdog@hitchhiker.social
       2019-02-08T05:51:27Z
       
       0 likes, 0 repeats
       
       @fribbledom There was a blog post way back about how BASIC was really good for kids; better than, say, Python. Without any weird imports, you can draw things, do basic loops, change colors. The 8-Bit Guy talks about how almost every ancient machine available has BASIC. If you know sh/bash and BASIC, you can use any machine in the Living Computer Museum (or other similar museums where you can play with old hardware).
       
 (DIR) Post #9fcFT29caUbXQS3Qxc by drewcassidy@mastodon.social
       2019-02-08T03:03:33Z
       
       1 like, 0 repeats
       
       @fribbledom my intro to programming was writing FORTH for computers in a Minecraft mod. It could be worse :P
       
 (DIR) Post #9fcbkL2foWQjv6Sea8 by fribbledom@mastodon.social
       2019-02-08T10:10:32Z
       
       0 likes, 0 repeats
       
       @djsumdog I think that is a bit of a myth, to be honest. Yes, they all share the same "BASIC" expressions, but you couldn't just port BASIC code from one system to another. Especially when accessing hardware & graphics features (PEEK/POKE), you quickly ended up being very, very platform-specific. Also, M-BASIC, CBASIC & GW-BASIC all sounded the same to me as a kid, but I quickly learned that's not the case.
       
 (DIR) Post #9fdBst5chr3eBQDVHE by vertigo@mastodon.social
       2019-02-08T05:53:42Z
       
       0 likes, 0 repeats
       
       @kragen @fribbledom What BCPL does successfully prevent, though, is stack imbalance.  I think this, perhaps, is what you're thinking about?  Stack imbalance is definitely an issue in Forth.
       
 (DIR) Post #9fdBstJnr8OstOqpNY by kragen@nerdculture.de
       2019-02-08T16:55:31Z
       
       0 likes, 0 repeats
       
       @vertigo @fribbledom I'm not sure what you mean by stack imbalance? What I meant was that in my example I meant to pass the return value of BAZ to WORF but inadvertently passed it to QUUX instead, and sucked in an unrelated 37 from somewhere else (which will eventually create an underflow if there isn't a countervailing extra stack item somewhere else); is that what you meant?
       
 (DIR) Post #9fdCDMc6eA3hXeAraa by vertigo@mastodon.social
       2019-02-08T16:59:14Z
       
       0 likes, 0 repeats
       
       @kragen @fribbledom Yes; a stack imbalance is when the stack pointer at run-time no longer matches your expectations at compile-time, and so you end up destroying parameters intended for future consumption (underflow) or you leave garbage which is then fed to future computations (overflow). Either condition is difficult to debug, and in large part is what motivated me to work on my Declarative, Inquisitive, then Imperative (DItI) coding style.
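One way to catch the imbalance described above is a run-time check comparing a word's declared stack effect against the actual depth change. A minimal sketch in JavaScript (this is not a real Forth facility; `checked` and the word definitions are invented for illustration):

```javascript
// Wrap a word with its declared stack effect ( nIn -- nOut ) and
// verify the depth after every call.
function checked(name, nIn, nOut, fn) {
  return (stack) => {
    const expected = stack.length - nIn + nOut;
    fn(stack);
    if (stack.length !== expected)
      throw new Error(`${name}: stack imbalance (expected depth ${expected}, got ${stack.length})`);
  };
}

const dup     = checked("DUP", 1, 2, (s) => s.push(s[s.length - 1]));
const badDrop = checked("DROP", 1, 0, (s) => { s.pop(); s.pop(); }); // buggy: pops twice

const s = [1, 2, 3];
dup(s);                    // fine: depth 3 -> 4
let caught = "";
try { badDrop(s); } catch (e) { caught = e.message; }
console.log(caught);       // DROP: stack imbalance (expected depth 3, got 2)
```

Real Forths skip this bookkeeping for speed, which is why the imbalance normally surfaces only as downstream corruption.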
       
 (DIR) Post #9fdCKjIkgz7PvxdMMy by kragen@nerdculture.de
       2019-02-08T17:00:34Z
       
       0 likes, 0 repeats
       
       @vertigo @fribbledom Hmm, I figured that if I spent more time programming in Forth I'd get used to it. But hey, I wasted a couple of hours debugging an uninitialized data bug in C, so maybe that doesn't happen.
       
 (DIR) Post #9fdCdtITkkmyR8dusa by vertigo@mastodon.social
       2019-02-08T17:04:02Z
       
       0 likes, 0 repeats
       
       @kragen @fribbledom For me, programming in Forth is a lot like practicing martial arts under traditional style instruction.  I had to spend many years balancing bowls (stacks?) on my knee while standing on one leg before I achieved any kind of enlightenment.
       
 (DIR) Post #9fdCrEsunmK79d3BzM by vertigo@mastodon.social
       2019-02-08T17:04:11Z
       
       0 likes, 0 repeats
       
       @kragen @fribbledom Put another way, knowing that I'll never have the benefit of a type-checked Forth environment at my disposal, I had to constantly ask myself, "What can I do to not only prevent imbalances, but *habituate* writing software so that they rarely happen again?"  I had to take an active role in refining not just what code I write, but how as well.
       
 (DIR) Post #9fdCrF7Rvjwvshqndw by kragen@nerdculture.de
       2019-02-08T17:06:26Z
       
       0 likes, 0 repeats
       
       @vertigo @fribbledom Yeah, my best approach there is to use more variables, shorter words, and very little stack manipulation — I'm suspicious even of DUP
       
 (DIR) Post #9fdDnYOSQWWYSIbvXc by vertigo@mastodon.social
       2019-02-08T17:16:58Z
       
       0 likes, 0 repeats
       
       @kragen @fribbledom I tend to use DUP, DROP, OVER, and NIP quite liberally.  I find that SWAP and (-)ROT are the rarest stack permutation words I use.  I'm surprised SWAP has become relatively rare, honestly.  Usually, I use (-)ROT only when working with pairs of buffers (e.g., as when parsing, copying data, etc.).
       
 (DIR) Post #9fdDyZiFrVBdjU7siu by kragen@nerdculture.de
       2019-02-08T17:18:59Z
       
       0 likes, 0 repeats
       
       @vertigo @fribbledom No TUCK or R@? I used to use them more and then I found that my code was less buggy (though longer) if I kept most of my variables in VARIABLEs instead of on the stack :)
       
 (DIR) Post #9fdEFOwV9tJbFhoUEa by kragen@nerdculture.de
       2019-02-08T17:22:00Z
       
       0 likes, 0 repeats
       
       @vertigo @fribbledom I mean I used to use DUP, OVER, and SWAP (and even ROT!) more frequently. It's possible that at some point I'll veer back in that direction. It's just dangerously tempting to me to keep four variables on the stack because I think I can remember where they are, but I can't
       
 (DIR) Post #9fdFaVrMjg1sk02e92 by vertigo@mastodon.social
       2019-02-08T17:37:02Z
       
       0 likes, 0 repeats
       
       @kragen @fribbledom I occasionally use R@, but I've never used TUCK.
       
 (DIR) Post #9fdFh8IUjrLRgrt6Lw by vertigo@mastodon.social
       2019-02-08T17:38:13Z
       
       0 likes, 0 repeats
       
       @kragen @fribbledom Yep; ideally, one should never have more than three items in an active computation on the stack at once.  (That's not to say that you should be limited to a stack depth of 3; rather only the top 3 are what any word should be primarily concerned about.)
       
 (DIR) Post #9fdGHqryUu5KLzzZdw by kragen@nerdculture.de
       2019-02-08T17:44:52Z
       
       0 likes, 0 repeats
       
       @vertigo @fribbledom I feel like the top of the return stack is a less problematic place for a temp variable than three slots down the operand stack.
       
 (DIR) Post #9fdGaBEbWHB4j9JgdE by vertigo@mastodon.social
       2019-02-08T17:48:10Z
       
       0 likes, 0 repeats
       
       @kragen @fribbledom It can be.  Until it's not.  ;)  I've had my fair share of accidentally unbalanced return stack issues as well.  That tended to happen before I learned of Moore's rule of thumb that a colon definition should not exceed two lines under ideal circumstances.
       
 (DIR) Post #9fdH1Y8wSByZb0icuO by vertigo@mastodon.social
       2019-02-08T17:49:57Z
       
       0 likes, 0 repeats
       
       @kragen @fribbledom But, of course, now I'm making heavy use of R-stack manipulation to exit whole blocks of words at a time in some cases.  It's hard to explain here, but why will become more apparent in the ascetic programming presentation.  And, after that, I've been thinking of writing my ideas down as a self-published book on ascetic programming, so hopefully I can go into very rich detail there.  (Hopefully, I'll have learned how to explain it by then.)
       
 (DIR) Post #9fdH1YikJ4cFO3TDiS by kragen@nerdculture.de
       2019-02-08T17:53:07Z
       
       0 likes, 0 repeats
       
       @vertigo @fribbledom Yeah, nonlocal exits are a very useful technique, both for error handling and for recursive search
       
 (DIR) Post #9fdH2bq6CI7TLgPVhY by vertigo@mastodon.social
       2019-02-08T17:50:43Z
       
       0 likes, 0 repeats
       
       @kragen @fribbledom With some clever thinking, though, one could use TRY/CATCH as an alternative control-flow mechanism here.
       
 (DIR) Post #9fdH2byxfLCznAYaW0 by kragen@nerdculture.de
       2019-02-08T17:53:19Z
       
       0 likes, 0 repeats
       
       @vertigo @fribbledom Or call/cc ;)
       
 (DIR) Post #9fdHKccE39iwy7Vhei by freakazoid@retro.social
       2019-02-08T17:56:26Z
       
       0 likes, 0 repeats
       
       @kragen @fribbledom @vertigo Please don't do full call/cc. Delimited and/or single-shot continuations please.
       
 (DIR) Post #9fdHNyylMLLFCFedf6 by fribbledom@mastodon.social
       2019-02-08T17:53:51Z
       
       0 likes, 0 repeats
       
       @vertigo @kragen I'm afraid I can't contribute much to the conversation... however, I find it amazing how this super interesting thread unfolded from my original toot 😊
       
 (DIR) Post #9fdHNzHCFo5S7QHMOW by kragen@nerdculture.de
       2019-02-08T17:57:09Z
       
       0 likes, 0 repeats
       
       @fribbledom @vertigo I'm not really contributing that much either, since my knowledge of Forth is pretty shallow, but I'm enjoying learning.
       
 (DIR) Post #9fdMZqAQRmaB73aDmy by vertigo@mastodon.social
       2019-02-08T17:59:34Z
       
       0 likes, 0 repeats
       
       @freakazoid @fribbledom @kragen Forth isn't capable of full continuations.  All you have access to are partial continuations, since you're only manipulating return addresses.
       
 (DIR) Post #9fdMZqQjT9ctvdDFCq by kragen@nerdculture.de
       2019-02-08T18:55:20Z
       
       0 likes, 0 repeats
       
       @vertigo @freakazoid @fribbledom You could do full continuations in Forth, but it would be a pretty big change to the language. You could get pretty far just by copying the data and return stacks somewhere else.
       
 (DIR) Post #9fdMnMu82h48PTPYMy by vertigo@mastodon.social
       2019-02-08T18:07:02Z
       
       0 likes, 0 repeats
       
       @freakazoid @fribbledom @kragen I just read up on delimited continuations on Wikipedia, and I'm every bit as hopelessly lost with them as I am with full continuations. I'm quickly coming to the conclusion that every concept that Scheme touches degenerates into meaninglessness.  ;)
       
 (DIR) Post #9fdMnNCYw9oLKe2H6O by kragen@nerdculture.de
       2019-02-08T18:57:47Z
       
       0 likes, 0 repeats
       
       @vertigo @freakazoid @fribbledom Scheme starts entirely from the user view of the system; you're trying to understand it from the implementation perspective, and the most popular sources don't discuss implementation at all. Maybe reading Olin Shivers's dissertation or Appel's "Compiling with Continuations" would be more useful.
       
 (DIR) Post #9fdNbl44UNFFoy5baS by vertigo@mastodon.social
       2019-02-08T19:06:53Z
       
       0 likes, 1 repeat
       
       @kragen @freakazoid @fribbledom Considering how many words go into explaining continuations from a user's perspective, I feel justified in my assessment. I'm actually not considering implementation issues at all at this level of abstraction.  I get full continuations, but had to spend almost a decade of on/off study before I finally realized they were just garbage-collected longjmp buffers.
       
 (DIR) Post #9fdO4fkZKM0vjhkqcS by vertigo@mastodon.social
       2019-02-08T19:07:33Z
       
       0 likes, 0 repeats
       
       @kragen @freakazoid @fribbledom I think if more people investigated their implementation methods, they'd actually be easier not only to explain, but to find viable applications for.
       
 (DIR) Post #9fdO4fwcbXegL5OTPE by kragen@nerdculture.de
       2019-02-08T19:12:07Z
       
       0 likes, 0 repeats
       
       @vertigo @freakazoid @fribbledom You mean, it's hard to imagine, from the generic explanation of continuations, how to apply them to the concrete things they're (arguably) good for, like backtracking search, multithreading, and error handling?
       
 (DIR) Post #9fdOL0OGM54uNs6pzU by vertigo@mastodon.social
       2019-02-08T19:14:55Z
       
       0 likes, 0 repeats
       
       @kragen @freakazoid @fribbledom Starting from a platform of zero-knowledge, yes.  Once you realize that they're a proper subset of co-routines, then it all starts making more sense.  From there, you can then realize, "Oh, wait, if they're subsets of raw co-routines, then co-routines can be implemented with continuations," and *now* the meaning of the formal definition flourishes into a fuller understanding.
       
 (DIR) Post #9fdOOLjhRsDnSIjNdw by kragen@nerdculture.de
       2019-02-08T19:15:41Z
       
       0 likes, 0 repeats
       
       @vertigo @freakazoid @fribbledom I think you mean "superset"?
       
 (DIR) Post #9fdOQXIRRFE5t91X2O by vertigo@mastodon.social
       2019-02-08T19:16:05Z
       
       0 likes, 0 repeats
       
       @kragen @freakazoid @fribbledom Nope.  Subset.  You can happily crash a computer with unconstrained coroutine creation.  You cannot even express the concept with continuations.
       
 (DIR) Post #9fdOXgXfShd2aeYRm4 by kragen@nerdculture.de
       2019-02-08T19:17:22Z
       
       0 likes, 0 repeats
       
       @vertigo @freakazoid @fribbledom Coroutines don't let you return from the same subroutine twice without reentering it, while unrestricted continuations do.
       
 (DIR) Post #9fdOiwYsF5KhSWUGYq by vertigo@mastodon.social
       2019-02-08T19:19:23Z
       
       0 likes, 0 repeats
       
       @kragen @freakazoid @fribbledom I disagree with this.  But I think we're getting into pedantics. If you have two task control blocks set up to identical values, and arrange control flow such that when one task terminates the other automatically resumes, that's returning from the same subroutine twice.  Each task needs its own (logical) copy of the stack(s) it needs to do its job.  So, yeah, technically continuations, but also coroutines.
       
 (DIR) Post #9fdPEkGHVaBkZuxGMa by kragen@nerdculture.de
       2019-02-08T19:25:08Z
       
       0 likes, 0 repeats
       
       @vertigo @freakazoid @fribbledom You mean, you make a copy (or lazy copy) of the stack of one task to create a new task? The coroutine facilities I'm familiar with (Lua's, Python's) don't let you do that, which might account for my confusion. What can coroutines do that continuations can't? Other than crashing, which you can do with anything that allocates memory. (Or feel free not to respond if you aren't interested in talking about it.)
       
 (DIR) Post #9fdPSiQV80vQwOOeWW by vertigo@mastodon.social
       2019-02-08T19:27:40Z
       
       0 likes, 0 repeats
       
       @kragen @freakazoid @fribbledom You have raw access to CPU registers in raw, fully-exposed coroutines.  So, for example, you can use coroutines as a mechanism for handling CPU interrupts (some versions of the Tripos operating system do this, for example), system calls, etc. Since the registers are exposed, you can also use them across language boundaries, which IIRC you cannot do with continuations, as continuations rely on garbage collection.  Which leads to another trait: they can be reused.
       
 (DIR) Post #9fdPdB77KA2ooDJpui by vertigo@mastodon.social
       2019-02-08T19:29:33Z
       
       0 likes, 0 repeats
       
       @kragen @freakazoid @fribbledom That's obviously because they don't want you to crash the runtime environment.  Look at coroutine libraries for lower-level languages, and you'll see that they expose more details to the programmer.  Regarding stacks, it's not usually copied on your behalf, but if you have access to the stack pointer, you can manually copy a stack prior to binding it to a coroutine.
       
 (DIR) Post #9fdULU0gpSGxyOh6Uy by akkartik@mastodon.social
       2019-02-08T20:22:22Z
       
       0 likes, 0 repeats
       
       @vertigo I built coroutines out of delimited continuations at http://akkartik.name/coroutines-in-mu, so that's an existence proof that continuations subsume coroutines. It's possible the opposite is also true; I haven't been able to find a definitive answer. (My article above may also, in all humility, help low-level programmers understand continuations. I'd be curious about your experiences if you happen to spend some time with it.) @kragen @freakazoid @fribbledom
       
 (DIR) Post #9feB9DorUoqYJQYspU by freakazoid@retro.social
       2019-02-08T20:27:20Z
       
       0 likes, 0 repeats
       
       @akkartik @fribbledom @kragen @vertigo Having a primitive *return* the continuation is an approach I haven't seen before. Is that really a delimited continuation? My understanding is that delimited continuations aren't supposed to be able to escape the delimited region of the stack so that they can't affect the control flow outside it.
       
 (DIR) Post #9feB9E1cjN3Sx0X4im by akkartik@mastodon.social
       2019-02-08T20:35:12Z
       
       0 likes, 0 repeats
       
       @freakazoid Yeah, I'm not certain I have the full power of delimited continuations in my implementation. I spent a while trying to build `amb` out of them and couldn't do it. I know you can return continuations outside of their delimiters. That's the only way to call them multiple times. Why would it matter whether that happens in a primitive or user code? @vertigo @kragen @fribbledom
       
 (DIR) Post #9feB9EIdi6fLnmUfFA by rain@niu.moe
       2019-02-08T20:44:09Z
       
       0 likes, 0 repeats
       
       @akkartik @freakazoid @vertigo @kragen @fribbledom
       (define (amb lst) (shift k (for-each k lst)))
       (define (assert b) (unless b (amb '())))
       (from Matt Might's page)
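For readers who don't speak Scheme, the same trick can be sketched in continuation-passing style in JavaScript, where an explicit callback `k` stands in for the delimited continuation that `shift` captures (`assertAmb` and the search predicate are illustrative names, not from the thread):

```javascript
// amb "calls the continuation" once per choice, enumerating every branch;
// assertAmb prunes branches whose condition fails by simply not continuing.
function amb(lst, k)      { for (const x of lst) k(x); }
function assertAmb(b, k)  { if (b) k(); }

// Example search: which x in 1..4 satisfies x*x > 8?
const results = [];
amb([1, 2, 3, 4], (x) =>
  assertAmb(x * x > 8, () =>
    results.push(x)));
console.log(results);   // [ 3, 4 ]
```

This only approximates `shift`/`reset`: the continuation is threaded by hand rather than captured from the surrounding context, which is exactly the part Scheme's operators automate.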
       
 (DIR) Post #9feB9ErjbcjrYcugwi by freakazoid@retro.social
       2019-02-08T20:48:07Z
       
       0 likes, 0 repeats
       
       @rain @fribbledom @kragen @vertigo @akkartik Oh! Right! You can call them multiple times, because unlike full continuations, delimited continuations return.
       
 (DIR) Post #9feB9FVRE0UvXlUOpc by freakazoid@retro.social
       2019-02-08T20:53:19Z
       
       0 likes, 0 repeats
       
       @akkartik @vertigo @kragen @fribbledom @rain And I realize I do know the difference between them: even though you can call a coroutine multiple times, you're always entering the coroutine at the same place it last yielded. Whereas with a continuation, you'd always be entering at the same place, with the same state, i.e. local variables, call stack, etc.
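The distinction drawn here shows up concretely with JavaScript generators, which act like one-shot coroutines: each resume continues from the last yield, and no earlier entry point can be replayed (a sketch; `counter` is an illustrative name):

```javascript
// A coroutine-style generator: control always re-enters at the most
// recent `yield`, carrying the state accumulated so far.
function* counter() {
  let n = 0;
  while (true) n += yield n;   // resumes exactly where it last yielded
}

const co = counter();
const a = co.next().value;     // run to the first yield: 0
const b = co.next(5).value;    // resume with 5: 0 + 5 = 5
const c = co.next(5).value;    // resume again: 5 + 5 = 10
console.log(a, b, c);          // 0 5 10 -- the 0 -> 5 step cannot be re-run
```

A first-class continuation captured at that first yield could, by contrast, be invoked again to replay the computation from that exact point with its original state.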
       
 (DIR) Post #9feB9FyVTvkYzv5c8m by kragen@nerdculture.de
       2019-02-09T04:21:57Z
       
       0 likes, 0 repeats
       
       @freakazoid @rain @fribbledom @vertigo @akkartik That's why I was saying I thought continuations were more general: they let you reenter some code anywhere, while coroutines limit you to reentering it at the same place it last yielded. But maybe there's something going on I don't understand! Multiple definitions of "coroutine" for example
       
 (DIR) Post #9feElNf9e91vhwuNE0 by vertigo@mastodon.social
       2019-02-09T05:02:30Z
       
       0 likes, 0 repeats
       
       @kragen @freakazoid @rain @fribbledom @akkartik This is possible; perhaps I'm misunderstanding something as well.  It appears that "fiber" is a contemporary term for coroutine-with-visible-processor-state, so maybe I'm recalling information from a time before "fiber" became popular.  If so, the mistake is definitely mine, and I apologize for that error.
       
 (DIR) Post #9feFNIHM6zus0wr41w by kragen@nerdculture.de
       2019-02-09T05:09:21Z
       
       0 likes, 0 repeats
       
       @vertigo @freakazoid @rain @fribbledom @akkartik I think "fiber" is a Win32-specific term; what do you mean "with visible processor state"?
       
 (DIR) Post #9feHdh3VxaCfOqKQVs by vertigo@mastodon.social
       2019-02-09T05:34:43Z
       
       0 likes, 0 repeats
       
       @kragen @freakazoid @rain @fribbledom @akkartik The CPU registers comprising the operating state of the coroutine/fiber. If you study the source code to Cintpos (the most recent incarnation of the Tripos operating system), you'll find regular application of coroutines, and code to manipulate their registers.  Tripos makes no attempt to hide coroutines, abstract them behind any language boundaries, or otherwise simplify what they are.
       
 (DIR) Post #9fgrtpUSkqWWptQ5RY by a_breakin_glass@cybre.space
       2019-02-10T11:30:25Z
       
       0 likes, 0 repeats
       
       @kragen @vertigo @loke @fribbledom or possibly like some cl libraries like cells
       
 (DIR) Post #9fhI4dipRl5qKZxFYm by a_breakin_glass@cybre.space
       2019-02-10T11:24:23Z
       
       0 likes, 0 repeats
       
       @vertigo @kragen @fribbledom
       > no type checking
       Unless you use a typed Forth variant, although the only one I'm aware of is StrongForth
       
 (DIR) Post #9fhI4eIHJxRw6WXYoa by thamesynne@dragon.style
       2019-02-10T11:25:56Z
       
       0 likes, 0 repeats
       
       @a_breakin_glass @vertigo @kragen @fribbledom factor probably counts too
       
 (DIR) Post #9fhI4eTGf6EwebgKwa by thamesynne@dragon.style
       2019-02-10T12:05:15Z
       
       0 likes, 0 repeats
       
       @a_breakin_glass @vertigo @kragen @fribbledom although to get back to the origin of the thread, i do wonder sometimes what the world would look like if the very first computers had been showered with ports of PDP-1 Lisp instead of Tiny BASIC
       
 (DIR) Post #9fhI4emlUbptd4nuKm by kragen@nerdculture.de
       2019-02-10T16:23:43Z
       
       0 likes, 0 repeats
       
       @thamesynne @a_breakin_glass @vertigo @fribbledom the very first home computers, you mean? I wonder if they had enough memory; Basic PDP-1 Lisp needed 2000 18-bit memory words http://s3data.computerhistory.org/pdp-1/DEC.pdp_1.1964.102650371.pdf while Tiny BASIC could run in 2KiB, less than half the memory. https://en.wikipedia.org/wiki/Tiny_BASIC
       
 (DIR) Post #9fhICcwmDDN2FBHsWW by kragen@nerdculture.de
       2019-02-10T16:25:10Z
       
       0 likes, 0 repeats
       
       @a_breakin_glass @vertigo @loke @fribbledom yeah, cells or The Trellis in Python or something like that might work
       
 (DIR) Post #9fhIbScTkKAvSBdPgu by kragen@nerdculture.de
       2019-02-10T16:29:39Z
       
       0 likes, 1 repeat
       
       @thamesynne @a_breakin_glass @vertigo @fribbledom Factor is dynamically typed, which is probably better for beginners than static typing systems like those of Cat and Joy, though either is probably better than just crashing with no error message
       
 (DIR) Post #9fhJDtGZ7haF6Ii3sW by kragen@nerdculture.de
       2019-02-10T16:36:35Z
       
       0 likes, 0 repeats
       
       @a_breakin_glass @vertigo @loke @fribbledom but you still need a *language* to be editing the Cells or Trellis formulas in, one that can be evaluated by the editing system without causing undesired side effects. I mean, I've done this with Python for my own purposes, but it acts pretty weird sometimes; maybe it's not that suitable as an environment for beginners
       
 (DIR) Post #9fhJa8xTSeHg8iOIiG by vertigo@mastodon.social
       2019-02-10T16:38:47Z
       
       0 likes, 0 repeats
       
@kragen @loke @fribbledom I'm confused; the existence of BASIC on Commodore machines from the PET through the 128 shows that's not the case.  The way it works is quite simple, and the same mechanism is used by the Commodore machine-language monitors.  Basically, your programs sit in a REPL where the "R" is just "Read a line of text".  That line is defined as whatever line the cursor happens to be on when the user presses the ENTER key.  Normally it's a new line, but it doesn't have to be.
       
 (DIR) Post #9fhJvd8ATYHznbdeJE by vertigo@mastodon.social
       2019-02-10T16:41:05Z
       
       0 likes, 0 repeats
       
@kragen @loke @fribbledom You can, for example, cursor up to a BASIC line of your choice, make whatever changes you need to make, and press ENTER, and that whole line will be re-evaluated by BASIC.  In the ML monitor, you'll notice strange symbols in column 0 or 1 for everything that it does.  To print the CPU registers, you use the R command, but to change them, again, cursor up, make your edits, and press RETURN.  The character in column 1 is actually the corresponding edit command.
       
 (DIR) Post #9fhJvdkSBCujiLYDz6 by kragen@nerdculture.de
       2019-02-10T16:44:28Z
       
       0 likes, 0 repeats
       
       @vertigo @loke @fribbledom Right, you have to press ENTER, which is equivalent to C-x C-e; that's adequate for imperative languages. Also in most cases, as I remember it, you would be editing a line of the program; that line wouldn't actually *execute* unless you invoke RUN and the program reaches it. Contrast that with Excel formulas or livecoding environments!
       
 (DIR) Post #9fhK1SX8mxlUfb6Bqy by kragen@nerdculture.de
       2019-02-10T16:45:34Z
       
       0 likes, 0 repeats
       
       @vertigo @loke @fribbledom actually I think in other Microsoft BASIC implementations (maybe not the Commodore?) you could CONT to continue the program after a change without losing your program state. RUN would indeed reset everything to zero
       
 (DIR) Post #9fhK42v6TY0C6FXaPw by vertigo@mastodon.social
       2019-02-10T16:46:00Z
       
       0 likes, 0 repeats
       
       @kragen @loke @fribbledom That depends.  I've hot-coded software (frequently the kernel!) with the ML monitor before.  ;)  Do be careful when doing this though.  Way too easy to crash the system.
       
 (DIR) Post #9fhK9lI2w6h99uzuKG by kragen@nerdculture.de
       2019-02-10T16:47:04Z
       
       0 likes, 0 repeats
       
       @vertigo @loke @fribbledom If you had a checkpoint and restart facility so that you could back up to before the crash, maybe it wouldn't be so bad!
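The checkpoint-and-restart idea can be sketched in Python. The names here are hypothetical, and a real facility for hot-patched kernels would snapshot machine state rather than a dict, but the shape is the same: copy everything before the risky change, roll back if it blows up.

```python
import copy

# Minimal sketch of a checkpoint/restart facility (hypothetical API):
# snapshot the mutable state before a risky patch, and restore the
# snapshot if the patch "crashes".

def patch_with_checkpoint(state, patch):
    checkpoint = copy.deepcopy(state)   # save everything before patching
    try:
        patch(state)                    # risky in-place modification
    except Exception:
        state.clear()                   # back up to before the crash
        state.update(checkpoint)
    return state

machine = {"pc": 0x1000, "memory": [0] * 4}

def bad_patch(state):
    state["memory"][2] = 0xFF           # partial change...
    raise RuntimeError("crash!")        # ...then the system goes down

patch_with_checkpoint(machine, bad_patch)
assert machine["memory"][2] == 0        # the partial change was rolled back
assert machine["pc"] == 0x1000
```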
       
 (DIR) Post #9fhKC2Cb1p7CBqX2uG by vertigo@mastodon.social
       2019-02-10T16:47:28Z
       
       0 likes, 0 repeats
       
@kragen @loke @fribbledom Only if the program changes don't cause the tokenized version of the program to expand.  Put another way, as long as you don't *grow* the binary image of the program, you're safe to use CONT.  Commodore BASIC is the same way, IIRC.
       
 (DIR) Post #9fhKTqMhmDB8gtkhw8 by vertigo@mastodon.social
       2019-02-10T16:50:41Z
       
       0 likes, 0 repeats
       
       @kragen @loke @fribbledom And they say microkernels are useless.  ;)
       
 (DIR) Post #9fhKgyoVg15Ier1PHs by kragen@nerdculture.de
       2019-02-10T16:53:03Z
       
       0 likes, 0 repeats
       
       @vertigo @loke @fribbledom Full virtualization is especially useful for this kind of thing — you can test the changes to your kernel in QEMU with a remote debugger or whatever before running your whole machine on it.
       
 (DIR) Post #9fhT7Gpu5j17iSilCy by suetanvil@mastodon.technology
       2019-02-10T18:27:27Z
       
       0 likes, 0 repeats
       
@kragen @vertigo @fribbledom As a kid, the perception was that Logo was a kids’ toy while BASIC was a Real Language that Real Coding was done in.  If LOGO had been extended into a Serious Language, we would have taken it seriously.  (Or if someone had explained to me that LOGO was really just Lisp.)
       
 (DIR) Post #9fhV2nuPtr76kYpVeC by kragen@nerdculture.de
       2019-02-10T18:49:04Z
       
       0 likes, 0 repeats
       
       @suetanvil @vertigo @fribbledom I think in that period a lot of us had the perception that neither Lisp nor BASIC was a Serious Language because it was too slow. On an 0.3 MIPS machine that matters a lot. Lisp (and LOGO) also had the problem that garbage collection uses a lot of memory — especially prior to the invention of generational GC.
       
 (DIR) Post #9fhVdcAGWniR2XnOAi by deejoe@mastodon.sdf.org
       2019-02-10T17:19:46Z
       
       0 likes, 0 repeats
       
       @loke @kragen @vertigo @fribbledom Dijkstra's quote about BASIC learners (CW: ableist) is something of a worrystone about my own path in computing, which started with BASIC too.
       
 (DIR) Post #9fhVddIoIW92ZKnj84 by vertigo@mastodon.social
       2019-02-10T17:44:15Z
       
       0 likes, 0 repeats
       
@deejoe @loke @kragen @fribbledom Yep, I have a great deal of respect for Dijkstra's technical and logical proficiency when it comes to structured programming and how one could apply formal verification to it; however, he was unquestionably wrong in some of his more opinionated pieces.
       
 (DIR) Post #9fhVdeUXsN7sG1Ic3k by a_breakin_glass@cybre.space
       2019-02-10T18:09:25Z
       
       0 likes, 0 repeats
       
@vertigo @deejoe @loke @kragen @fribbledom this also applies to Dijkstra's opinion of APL, arguably
       
 (DIR) Post #9fhVdfPGTUUp5vpuT2 by kragen@nerdculture.de
       2019-02-10T18:55:41Z
       
       0 likes, 0 repeats
       
       @a_breakin_glass @vertigo @deejoe @loke @fribbledom It's not clear. Certainly APL/A+/J/K has progressively lost popularity since its heyday in the 1960s, largely being supplanted by VisiCalc, Pascal, and awk, suggesting that perhaps it wasn't really the language of the future. But MATLAB, Numpy, R, and now TensorFlow are super important nowadays, and progressively more so.
       
 (DIR) Post #9fhXYb5jI4N6AjPQmm by thomasfuchs@mastodon.social
       2019-02-08T03:12:24Z
       
       0 likes, 0 repeats
       
       @fribbledom IDK some of the stuff like the drawing commands and music language is pretty powerful!
       
 (DIR) Post #9fhXYbFIiU1mePt4hk by suetanvil@mastodon.technology
       2019-02-10T18:57:38Z
       
       0 likes, 0 repeats
       
@thomasfuchs @fribbledom IME, the main problem with BASIC was that it didn’t let you define new words. That’s fundamental to programming but it wasn’t available. Learning C afterward was a huge revelation, both because I could define functions and because the existing functions I was using were just like the ones I’d already created.
       
 (DIR) Post #9fhXYbVxiXM5U5gNfs by kragen@nerdculture.de
       2019-02-10T19:17:12Z
       
       0 likes, 0 repeats
       
       @suetanvil @thomasfuchs @fribbledom Yeah, this was a big deal to me too. You can do GOSUB in BASIC, but when the same thing looks so different, it's hard to recognize it as the same thing.
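The point about GOSUB and named functions being the same mechanism in different clothes can be made concrete. This illustrative Python sketch (not real BASIC) mimics "GOSUB 1000" by dispatching on a line number into shared state, then does the same computation as an ordinary named function:

```python
# A BASIC GOSUB and a named function are the same mechanism, but the
# surface syntax hides that.  Here "GOSUB 1000" is mimicked by jumping
# to a numbered subroutine that works on shared variables; the second
# version is the same computation as a named function of its inputs.

def gosub(line, subroutines, state):
    """BASIC style: jump by line number; communicate via shared state."""
    subroutines[line](state)

def line_1000(state):                  # the subroutine at "line 1000"
    state["X"] = state["X"] * 2

state = {"X": 21}
gosub(1000, {1000: line_1000}, state)  # GOSUB 1000
assert state["X"] == 42

# C/Pascal style: the same computation, but named and parameterized.
def double(x):
    return x * 2

assert double(21) == 42
```

Same subroutine call both times; only the second *looks* like one.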
       
 (DIR) Post #9fhYe3cyijq44t82DY by vertigo@mastodon.social
       2019-02-10T19:29:24Z
       
       0 likes, 0 repeats
       
@kragen @a_breakin_glass @deejoe @loke @fribbledom I don't think APL was ever intended to be the language of the future.
       
 (DIR) Post #9fhh6KxOi9zQmBP8fA by kragen@nerdculture.de
       2019-02-10T21:04:09Z
       
       0 likes, 0 repeats
       
       @vertigo @a_breakin_glass @deejoe @loke @fribbledom On the contrary, Iverson regarded it as a substantial advance in notation
       
 (DIR) Post #9fhhcx7wKbrMQPt6oq by vertigo@mastodon.social
       2019-02-10T21:10:03Z
       
       0 likes, 0 repeats
       
@kragen @a_breakin_glass @deejoe @loke @fribbledom Yes, for creating algorithms.  APL wasn't even intended to be an executable notation until one of his students proved that it could be done.  Fun fact: the exact same circumstances existed with Lisp as well.
       
 (DIR) Post #9fhtRfuZCJHNIQ1ObI by vertigo@mastodon.social
       2019-02-10T21:11:42Z
       
       0 likes, 0 repeats
       
       @kragen @a_breakin_glass @deejoe @loke @fribbledom That said, I do find it damn sad that APL/J/K are not more widespread.  They are, notationally, significantly more convenient to use than the corresponding Lisp constructs, and they'd make excellent languages to replace BASIC as a system to boot into on an 80s-era home computer.
       
 (DIR) Post #9fhtRgBw9jAqAI9Gfw by kragen@nerdculture.de
       2019-02-10T23:17:33Z
       
       0 likes, 0 repeats
       
       @vertigo @a_breakin_glass @deejoe @loke @fribbledom I don't know why you couldn't have done an APL for an 80s-era home computer; have you seen http://code.jsoftware.com/wiki/Essays/Incunabulum?
       
 (DIR) Post #9fhthtAbfaGN3CFtdA by vertigo@mastodon.social
       2019-02-10T23:25:23Z
       
       0 likes, 0 repeats
       
       @kragen @a_breakin_glass @deejoe @loke @fribbledom Yep.  An APL would have been, I claim, easier to implement than a typical Lisp environment, because it could rely upon reference-counting to productively manage memory (since APL, like Python, very-very-very strongly encourages data laid out like a tree, and takes advantage of this being normally the case).
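The reference-counting point can be observed directly in Python, since CPython itself manages memory by reference counting (this is a CPython implementation detail, not a language guarantee; PyPy, for instance, defers finalization). With tree-shaped, acyclic data, the whole structure is reclaimed deterministically the instant the last reference goes away:

```python
# CPython reclaims acyclic ("tree-shaped") data via reference counting
# the moment the last reference disappears -- no GC pause, no semispace.
# We observe the reclamation order with __del__.  CPython-specific toy.

freed = []

class Node:
    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)
    def __del__(self):
        freed.append(self.name)       # runs as soon as refcount hits zero

tree = Node("root", [Node("left"), Node("right")])
assert freed == []                    # everything still reachable
del tree                              # drop the only reference to the root...
assert sorted(freed) == ["left", "right", "root"]   # ...whole tree reclaimed
```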
       
 (DIR) Post #9fhvORuSzyRCuVZ32W by vertigo@mastodon.social
       2019-02-10T23:26:11Z
       
       0 likes, 0 repeats
       
@kragen @a_breakin_glass @deejoe @loke @fribbledom  It'd also amortize the overhead of interpretation a lot better, since you can get away without a byte-code compiler while barely making an impact on runtime performance.  (Evidence of this comes from K, which finds extensive use in HFT and other financial applications, yet K is *string interpreted*, not tokenized at all.  I've also shown this seems to be the case with J in a blog article I wrote.)
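The amortization argument can be illustrated with a toy interpreter sketch in Python (hypothetical names; the dispatch counter stands in for the per-operation cost of interpretation). A scalar-at-a-time interpreter pays the dispatch cost once per element, while an APL/J/K-style whole-array operation pays it once per *array*:

```python
# Why array languages amortize interpreter overhead: an interpreted
# "opcode" that operates on a whole array pays the dispatch cost once,
# while scalar opcodes pay it per element.  The counter stands in for
# the (expensive) interpretation step.

dispatches = 0

def dispatch(op, *args):
    """Simulate the interpreter's per-operation overhead."""
    global dispatches
    dispatches += 1
    return op(*args)

data = list(range(1000))

# Scalar-at-a-time interpretation: one dispatch per element.
dispatches = 0
scalar_result = [dispatch(lambda x: x + 1, x) for x in data]
scalar_dispatches = dispatches          # 1000

# Array-at-a-time interpretation (APL/J/K style): one dispatch total.
dispatches = 0
array_result = dispatch(lambda xs: [x + 1 for x in xs], data)
array_dispatches = dispatches           # 1

assert scalar_result == array_result
assert scalar_dispatches == 1000 and array_dispatches == 1
```

With the real work done inside one primitive, even a string-interpreted front end like K's can stay off the critical path.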
       
 (DIR) Post #9fhvOS9i5IdBfmhDnc by vertigo@mastodon.social
       2019-02-10T23:26:46Z
       
       0 likes, 1 repeats
       
       @kragen @a_breakin_glass @deejoe @loke @fribbledom https://sam-falvo.github.io/2014/01/05/subroutine-performance-in-j for those interested in reading it.