[HN Gopher] ML on Apple ][+
___________________________________________________________________
ML on Apple ][+
Author : mcramer
Score : 88 points
Date : 2025-09-29 16:12 UTC (6 hours ago)
(HTM) web link (mdcramer.github.io)
(TXT) w3m dump (mdcramer.github.io)
| rob_c wrote:
| Since when did regression get upgraded to full-blown ML?
| nekudotayim wrote:
| What is ML if not interpolation and extrapolation?
| magic_hamster wrote:
| A million things.
|
| Diffusion, back propagation, attention, to name a few.
| have-a-break wrote:
| Back prop and attention are just extensions of
| interpolation.
| rob_c wrote:
| By that logic it's all "just linear maths".
|
| Back prop requires, and is limited to, functions that are
| analytically differentiable in the usual sense.
|
| Attention is... oh dear, comparing linear regression to
| attention is like comparing a diesel jet engine to a horse.
| aleph_naught wrote:
| It's all just a series of S(S(S(....S(0)))) anyways.
| stonogo wrote:
| When you find yourself solving NP-hard problems on an Apple II,
| chances are strong you've entered machine learning territory.
| DonHopkins wrote:
| Since when did ML get upgraded to full-blown AI?
| drob518 wrote:
| Upvoted purely for nostalgia.
| gwbas1c wrote:
| Any particular reason why the author chose to do this on an Apple
| ][?
|
| (I mean, the pictures look cool and all.)
|
| I.e., did the author want to experiment with older forms of
| BASIC, or were they trying to learn more about old computers?
| shagie wrote:
| One of my early "this is _neat_" programs was a genetic
| algorithm in Pascal. You entered a bunch of digits and it
| "evolved" the same sequence of digits. It started out with 10
| random numbers. Their fitness (lower was better) was the sum of
| the per-digit differences, so if the target was "123456" and the
| test number was "214365", it had a fitness of 6. It took the top
| 5, then mutated a random digit in each by a random +/- 1. It
| printed out each row with the full population, so you could see
| it scrolling as it converged on the target number.
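|
| The core loop would have looked something like this (a minimal
| Python sketch of the algorithm as described; the constants and
| the stopping condition are assumptions, not the original
| Pascal):
|
|     import random
|
|     TARGET = "123456"  # the digit sequence to evolve toward
|     POP_SIZE = 10      # population of 10 random numbers
|     KEEP = 5           # survivors per generation (the "top 5")
|
|     def fitness(candidate):
|         # Sum of per-digit differences; lower is better.
|         # e.g. "214365" vs. "123456" scores 6.
|         return sum(abs(int(a) - int(b))
|                    for a, b in zip(candidate, TARGET))
|
|     def mutate(candidate):
|         # Nudge one random digit by +/- 1, clamped to 0..9.
|         i = random.randrange(len(candidate))
|         d = max(0, min(9, int(candidate[i]) +
|                        random.choice((-1, 1))))
|         return candidate[:i] + str(d) + candidate[i + 1:]
|
|     population = ["".join(random.choice("0123456789")
|                           for _ in TARGET)
|                   for _ in range(POP_SIZE)]
|     generation = 0
|     while min(fitness(c) for c in population) > 0:
|         population.sort(key=fitness)
|         print(generation, population)  # scrolls as it converges
|         survivors = population[:KEEP]
|         population = survivors + [mutate(c) for c in survivors]
|         generation += 1
|     print("converged:", min(population, key=fitness))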
|
| Looking back, I want to say it was _probably_ the July 1992
| issue of Scientific American that inspired me to write that
| (https://www.geos.ed.ac.uk/~mscgis/12-13/s1100074/Holland.pdf).
| And as that was '92, this _might_ have been on a Mac rather than
| an Apple ][+... it was certainly in Pascal (my first class in C
| was in August '92) and I had access to both at the time (I don't
| think it was Turbo Pascal on a PC, as this was a summer thing
| and I didn't have an IBM PC at home at the time). Alas, I
| remember more about the specifics of the program than I do about
| what desk I was sitting at.
| Steeeve wrote:
| I wrote a whole project in Pascal around that time, analyzing
| two datasets. It was running out of memory the night before it
| was due, so I decided to have it run twice, once for each
| dataset.
|
| That's when I learned a very important principle: "When
| something needs doing quickly, don't force artificial
| constraints on yourself."
|
| I could have spent three days figuring out how to deal with the
| memory constraints. But instead I just cut the data in half and
| gave it two runs. The quick solution was the one that was
| needed. It's been an important memory for me, one I've thought
| about quite a bit over the last 30+ years.
| aardvark179 wrote:
| I thought this was going to be about the programming language,
| and I was wondering how they managed to implement it on a machine
| that small.
| Scramblejams wrote:
| Same. What flavor of ML would be the most appropriate for that
| challenge, do you think?
| taolson wrote:
| While not exactly ML, David Turner's Miranda system is pretty
| small, and might be feasible:
|
| https://codeberg.org/DATurner/miranda
| noelwelsh wrote:
| That's also what I was thinking. ML predates the Apple II by 4
| years, so I think there is definitely a chance of getting it
| running! If targeting the Apple IIGS, I think it would be very
| achievable; you could fit _megabytes_ of RAM in those.
| amilios wrote:
| Bit of a weird choice to draw a decision boundary for a
| clustering algorithm...
| aperrien wrote:
| An aeon ago, in 1984, I wrote a perceptron on the Apple II. It
| was amazingly slow (20 minutes to complete a recognition pass),
| but what most impressed me at the time was that it did work.
| Ever since then, even as a kid, I wondered just how far linear
| optimization techniques could take us. If I could just tell
| myself then what I know now...
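|
| The rule itself fits in a few lines. A minimal Python sketch of
| that kind of perceptron (the AND-gate data, learning rate, and
| epoch cap are illustrative assumptions, not the original 1984
| program):
|
|     import random
|
|     # Toy, linearly separable training set (an AND gate).
|     data = [((0, 0), 0), ((0, 1), 0),
|             ((1, 0), 0), ((1, 1), 1)]
|     w = [random.uniform(-1, 1), random.uniform(-1, 1)]
|     b = random.uniform(-1, 1)
|     lr = 0.1
|
|     def predict(x):
|         # Threshold unit over a weighted sum: the whole model.
|         return 1 if w[0]*x[0] + w[1]*x[1] + b > 0 else 0
|
|     for epoch in range(100):
|         errors = 0
|         for x, target in data:
|             err = target - predict(x)  # perceptron update rule
|             if err:
|                 w[0] += lr * err * x[0]
|                 w[1] += lr * err * x[1]
|                 b += lr * err
|                 errors += 1
|         if errors == 0:  # every point classified correctly
|             break
|
|     print("weights:", w, "bias:", b)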
| alexshendi wrote:
| This motivates me to try this on my Minstrel 4th (a
| 21st-century Jupiter Ace clone).
___________________________________________________________________
(page generated 2025-09-29 23:00 UTC)