Posts by dougmerritt@mathstodon.xyz
(DIR) Post #AuN1DhxaWDiUrv7lfU by dougmerritt@mathstodon.xyz
2025-05-23T01:27:40Z
0 likes, 1 repeats
@screwlisp @the_dot_matrix I don't have a substantive comment, but I do want to point out semi-related technical work that shows a fairly trivial way to upgrade your display art:

Tiny Petting Zoo @TinyPettingZoo@mas.to, e.g.
https://mathstodon.xyz/@TinyPettingZoo@mas.to/114549684571383575

Similarly:

EmojiMeadow @EmojiMeadow: Pastoral emoji grid
https://mathstodon.xyz/@EmojiMeadow@mas.to/114553675534828722

Emoji Aquarium @EmojiAquarium
https://mathstodon.xyz/@EmojiAquarium@mas.to/114553862016357558

Projects (the connection between each project and a Twitter or Mastodon account such as the above is unfortunately somewhat opaque, but oh well):
https://gitlab.com/users/JoeSondow/projects
(DIR) Post #AuddxxRiuSrf5pu91E by dougmerritt@mathstodon.xyz
2025-05-30T22:36:42Z
0 likes, 1 repeats
@jnpn Mentioning APL gives me an excuse to mention my late friend's fairly famous joke:

"There are three things a man must do
before his life is done;
Write two lines in APL,
And make the buggers run."
-- Stan Kelly-Bootle (from his "Devil's DP Dictionary")

P.S. He was a fan of APL and was e.g. well known in SIGAPL meetings.
(DIR) Post #AuqIfMxwES0KkwkNDE by dougmerritt@mathstodon.xyz
2025-06-05T18:31:23Z
0 likes, 1 repeats
@vnikolov Indeed it is!!!

@screwlisp @mdhughes @alex @jasmaz @jeremy_list @spidercat
(DIR) Post #AusT4hFHMDlskYUeSe by dougmerritt@mathstodon.xyz
2025-06-07T05:02:55Z
0 likes, 0 repeats
@mdhughes Then you may be amused by Brainfuck implemented in Binary Lambda Calculus, an 829-bit ("bit") implementation.

https://tromp.github.io/cl/Binary_lambda_calculus.html#brainfuck
(DIR) Post #Aute25WVvg2XpdTQXo by dougmerritt@mathstodon.xyz
2025-06-07T19:18:04Z
0 likes, 0 repeats
@twylo @ChuckMcManis Just last night I was saying that Polaroids are one of the few widely used film solutions that don't have a direct digital replacement.

Someone probably *could* add a tiny color printer to a digital camera, but would it be as good? In any case I haven't seen such a thing.
(DIR) Post #Av8FNyzchaeCHjYW3c by dougmerritt@mathstodon.xyz
2025-06-14T20:21:11Z
0 likes, 0 repeats
@smashedratonpress "Miscellany № 109, now up! What is it that makes the full stop so terrifying to younger Japanese people? That isn't a rhetorical question: I really want to know! Read on to learn more."

https://mathstodon.xyz/@shadychars@mastodon.social/114676725768393583
(DIR) Post #AvGoBFbQgfPlh9aCtk by dougmerritt@mathstodon.xyz
2025-06-18T23:01:11Z
0 likes, 1 repeats
@mdhughes > So until actual source control in the late '80s/early '90s

Histories tend to leave out the earliest history, and this is another example.

SCCS counts; it dates to 1973, depending on features.

Also, although Larry Wall deserves credit for the feature set of patch(1), diff itself supported patching from its earliest days (1974..1976 or some such):

> In diff's early years, common uses included comparing changes in the source of software code and markup for technical documents, verifying program debugging output, comparing filesystem listings and analyzing computer assembly code. The output targeted for ed was motivated to provide compression for a sequence of modifications made to a file.[citation needed] The Source Code Control System (SCCS) and its ability to archive revisions emerged in the late 1970s as a consequence of storing edit scripts from diff.

https://en.wikipedia.org/wiki/Diff#History
https://en.wikipedia.org/wiki/Source_Code_Control_System

@screwlisp @vindarel @khinsen @kentpitman
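For anyone who hasn't seen the pre-patch(1) workflow the quote refers to, here is a minimal sketch of patching with diff's ed-script output. It assumes a POSIX diff and ed are on the PATH; the file names and contents are made up for the demo, and Python is used purely as a convenient driver rather than as anything historical.

import pathlib
import subprocess

old = pathlib.Path("greeting_v1.txt")
new = pathlib.Path("greeting_v2.txt")
old.write_text("hello, world\n")
new.write_text("hello, fediverse\n")

# 1. Ask diff for an ed script describing how to turn old into new.
#    (diff exits 1 when the files differ, so no check=True here.)
ed_script = subprocess.run(
    ["diff", "-e", str(old), str(new)],
    capture_output=True, text=True,
).stdout

# 2. Replay the script with ed itself, appending 'w' and 'q' so the
#    edited buffer is written back over the old file.
subprocess.run(
    ["ed", "-s", str(old)],
    input=ed_script + "w\nq\n",
    text=True, check=True,
)

assert old.read_text() == new.read_text()  # the "patch" has been applied

The edit script itself ("1c", the replacement line, ".") is exactly the kind of compact modification sequence the Wikipedia passage says SCCS later archived as revisions.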
(DIR) Post #AvRTXesm3TRtXnvAxc by dougmerritt@mathstodon.xyz
2025-06-24T01:33:24Z
0 likes, 0 repeats
@AmenZwa Hmm? What do you mean, "can no longer read" them??
(DIR) Post #AvRTXj5KQjrsZtpG5I by dougmerritt@mathstodon.xyz
2025-06-24T02:20:39Z
0 likes, 1 repeats
@screwlisp Interpolating that Natali post for convenience:

"II. Behavioral and Cognitive Engagement

- Quoting Ability: LLM users failed to quote accurately, while Brain-only participants showed robust recall and quoting skills.

- Ownership: Brain-only group claimed full ownership of their work; LLM users expressed either no ownership or partial ownership.

- Critical Thinking: Brain-only participants cared more about *what* and *why* they wrote; LLM users focused on *how*.

- Cognitive Debt: Repeated LLM use led to shallow content repetition and reduced critical engagement. This suggests a buildup of "cognitive debt", deferring mental effort at the cost of long-term cognitive depth."

@ChuckMcManis @glitzersachen @AmenZwa
(DIR) Post #Az3M7dHB8Gic6rlPCS by dougmerritt@mathstodon.xyz
2025-10-10T03:12:52Z
0 likes, 0 repeats
@Twisted666 @SecureOwl When the intent of humor transcends the vocabulary of humor.
(DIR) Post #AziNfbejpBG7Ur3fsG by dougmerritt@mathstodon.xyz
2025-10-24T21:43:25Z
0 likes, 0 repeats
@nixCraft There are in fact browser-from-scratch projects, but the general problem is that backward compatibility issues are huge, and always have been, and just get worse each year.
(DIR) Post #B0S0brSoX8reM1cNjU by dougmerritt@mathstodon.xyz
2025-11-20T20:48:49Z
0 likes, 1 repeats
Preserving code that shaped generations: Zork I, II, and III go Open Source

"Today, we're preserving a cornerstone of gaming history that is near and dear to our hearts. Together, Microsoft's Open Source Programs Office (OSPO), Team Xbox, and Activision are making Zork I, Zork II, and Zork III available under the MIT License. Our goal is simple: to place historically important code in the hands of students, teachers, and developers so they can study it, learn from it, and, perhaps most importantly, play it."
...
"The games remain commercially available via The Zork Anthology on Good Old Games."

https://opensource.microsoft.com/blog/2025/11/20/preserving-code-that-shaped-generations-zork-i-ii-and-iii-go-open-source
https://github.com/historicalsource/zork1

And an Ars Technica article about that:
https://arstechnica.com/gaming/2025/11/microsoft-makes-zork-i-ii-and-iii-open-source-under-mit-license/

I originally beat Zork in the original PDP-10 version, arpanet-ing in to MIT to play it.
(DIR) Post #B0jjoC4ynZnvU0vANE by dougmerritt@mathstodon.xyz
2025-11-23T02:42:49Z
0 likes, 2 repeats
A Brief, Incomplete, and Mostly Wrong History of Programming Languages
James Iry; Thursday, May 7, 2009

1801 - Joseph Marie Jacquard uses punch cards to instruct a loom to weave "hello, world" into a tapestry. Redditers of the time are not impressed due to the lack of tail call recursion, concurrency, or proper capitalization.

1842 - Ada Lovelace writes the first program. She is hampered in her efforts by the minor inconvenience that she doesn't have any actual computers to run her code. Enterprise architects will later relearn her techniques in order to program in UML.

1936 - Alan Turing invents every programming language that will ever be but is shanghaied by British Intelligence to be 007 before he can patent them.

1936 - Alonzo Church also invents every language that will ever be but does it better. His lambda calculus is ignored because it is insufficiently C-like. This criticism occurs in spite of the fact that C has not yet been invented.

1940s - Various "computers" are "programmed" using direct wiring and switches. Engineers do this in order to avoid the tabs vs spaces debate.

1957 - John Backus and IBM create FORTRAN. There's nothing funny about IBM or FORTRAN. It is a syntax error to write FORTRAN while not wearing a blue tie.

1/6
(DIR) Post #B0jjoKTjZQMJX6Z34K by dougmerritt@mathstodon.xyz
2025-11-23T02:44:15Z
0 likes, 0 repeats
1958 - John McCarthy and Paul Graham invent LISP. Due to high costs caused by a post-war depletion of the strategic parentheses reserve LISP never becomes popular[1]. In spite of its lack of popularity, LISP (now "Lisp" or sometimes "Arc") remains an influential language in "key algorithmic techniques such as recursion and condescension"[2].

1959 - After losing a bet with L. Ron Hubbard, Grace Hopper and several other sadists invent the Capitalization Of Boilerplate Oriented Language (COBOL). Years later, in a misguided and sexist retaliation against Adm. Hopper's COBOL work, Ruby conferences frequently feature misogynistic material.

1964 - John Kemeny and Thomas Kurtz create BASIC, an unstructured programming language for non-computer scientists.

1965 - Kemeny and Kurtz go to 1964.

1970 - Guy Steele and Gerald Sussman create Scheme. Their work leads to a series of "Lambda the Ultimate" papers culminating in "Lambda the Ultimate Kitchen Utensil." This paper becomes the basis for a long running, but ultimately unsuccessful run of late night infomercials. Lambdas are relegated to relative obscurity until Java makes them popular by not having them.

1970 - Niklaus Wirth creates Pascal, a procedural language. Critics immediately denounce Pascal because it uses "x := x + y" syntax instead of the more familiar C-like "x = x + y". This criticism happens in spite of the fact that C has not yet been invented.

1972 - Dennis Ritchie invents a powerful gun that shoots both forward and backward simultaneously. Not satisfied with the number of deaths and permanent maimings from that invention he invents C and Unix.

2/6
(DIR) Post #B0jjoSbTMXIojQFLF2 by dougmerritt@mathstodon.xyz
2025-11-23T02:47:45Z
0 likes, 0 repeats
1973 - Robin Milner creates ML, a language based on the M&M type theory. ML begets SML which has a formally specified semantics. When asked for a formal semantics of the formal semantics Milner's head explodes. Other well known languages in the ML family include OCaml, F#, and Visual Basic.

1980 - Alan Kay creates Smalltalk and invents the term "object oriented." When asked what that means he replies, "Smalltalk programs are just objects." When asked what objects are made of he replies, "objects." When asked again he says "look, it's all objects all the way down. Until you reach turtles."

1983 - In honor of Ada Lovelace's ability to create programs that never ran, Jean Ichbiah and the US Department of Defense create the Ada programming language. In spite of the lack of evidence that any significant Ada program is ever completed historians believe Ada to be a successful public works project that keeps several thousand roving defense contractors out of gangs.

1983 - Bjarne Stroustrup bolts everything he's ever heard of onto C to create C++. The resulting language is so complex that programs must be sent to the future to be compiled by the Skynet artificial intelligence. Build times suffer. Skynet's motives for performing the service remain unclear but spokespeople from the future say "there is nothing to be concerned about, baby," in an Austrian accented monotone. There is some speculation that Skynet is nothing more than a pretentious buffer overrun.

3/6
(DIR) Post #B0jjobCHPZUxNtWqiu by dougmerritt@mathstodon.xyz
2025-11-23T02:48:48Z
0 likes, 0 repeats
1986 - Brad Cox and Tom Love create Objective-C, announcing "this language has all the memory safety of C combined with all the blazing speed of Smalltalk." Modern historians suspect the two were dyslexic.

1987 - Larry Wall falls asleep and hits Larry Wall's forehead on the keyboard. Upon waking Larry Wall decides that the string of characters on Larry Wall's monitor isn't random but an example program in a programming language that God wants His prophet, Larry Wall, to design. Perl is born.

1990 - A committee formed by Simon Peyton-Jones, Paul Hudak, Philip Wadler, Ashton Kutcher, and People for the Ethical Treatment of Animals creates Haskell, a pure, non-strict, functional language. Haskell gets some resistance due to the complexity of using monads to control side effects. Wadler tries to appease critics by explaining that "a monad is a monoid in the category of endofunctors, what's the problem?"

1991 - Dutch programmer Guido van Rossum travels to Argentina for a mysterious operation. He returns with a large cranial scar, invents Python, is declared Dictator for Life by legions of followers, and announces to the world that "There Is Only One Way to Do It." Poland becomes nervous.

1995 - At a neighborhood Italian restaurant Rasmus Lerdorf realizes that his plate of spaghetti is an excellent model for understanding the World Wide Web and that web applications should mimic their medium. On the back of his napkin he designs Programmable Hyperlinked Pasta (PHP). PHP documentation remains on that napkin to this day.

4/6
(DIR) Post #B0jjojHtKajyTcDRa4 by dougmerritt@mathstodon.xyz
2025-11-23T02:49:43Z
0 likes, 0 repeats
1995 - Yukihiro "Mad Matz" Matsumoto creates Ruby to avert some vaguely unspecified apocalypse that will leave Australia a desert run by mohawked warriors and Tina Turner. The language is later renamed Ruby on Rails by its real inventor, David Heinemeier Hansson. [The bit about Matsumoto inventing a language called Ruby never happened and better be removed in the next revision of this article - DHH].

1995 - Brendan Eich reads up on every mistake ever made in designing a programming language, invents a few more, and creates LiveScript. Later, in an effort to cash in on the popularity of Java the language is renamed JavaScript. Later still, in an effort to cash in on the popularity of skin diseases the language is renamed ECMAScript.

1996 - James Gosling invents Java. Java is a relatively verbose, garbage collected, class based, statically typed, single dispatch, object oriented language with single implementation inheritance and multiple interface inheritance. Sun loudly heralds Java's novelty.

5/6
(DIR) Post #B0jjorH7dKsXFXuwUa by dougmerritt@mathstodon.xyz
2025-11-23T02:49:59Z
0 likes, 0 repeats
2001 - Anders Hejlsberg invents C#. C# is a relatively verbose, garbage collected, class based, statically typed, single dispatch, object oriented language with single implementation inheritance and multiple interface inheritance. Microsoft loudly heralds C#'s novelty.

2003 - A drunken Martin Odersky sees a Reese's Peanut Butter Cup ad featuring somebody's peanut butter getting on somebody else's chocolate and has an idea. He creates Scala, a language that unifies constructs from both object oriented and functional languages. This pisses off both groups and each promptly declares jihad.

Footnotes
[1] Fortunately for computer science the supply of curly braces and angle brackets remains high.
[2] Catch as catch can - Verity Stob

https://james-iry.blogspot.com/2009/05/brief-incomplete-and-mostly-wrong.html

6/6
(DIR) Post #B15NeFFeE762uYsdhA by dougmerritt@mathstodon.xyz
2025-12-09T04:40:39Z
1 likes, 1 repeats
@screwlisp I just started to watch this:

What Happened to Gopher? The Internet We Lost
https://www.youtube.com/watch?v=Flo9kn_nhbg
(DIR) Post #B28zsQXIRGVd1Bn6Ey by dougmerritt@mathstodon.xyz
2026-01-10T07:33:47Z
0 likes, 0 repeats
"Intelligence is not a ladder, with steps along one dimension; it is multidimensional, a radiation. The space of possible intelligences is very large, even vast, with human intelligence occupying a tiny spot at the edge of this galaxy of possible minds. Every other possible mind is alien, and we have begun the very long process of populating this space with thousands of other species of possible minds."-- Kevin Kelly; "Artificial Intelligences, So Far"; 2025 Oct 20;https://kevinkelly.substack.com/p/artificial-intelligences-so-far