[HN Gopher] I built my own 16-Bit CPU in Excel [video]
___________________________________________________________________
I built my own 16-Bit CPU in Excel [video]
Author : SushiHippie
Score : 85 points
Date : 2024-01-28 05:23 UTC (17 hours ago)
(HTM) web link (www.youtube.com)
(TXT) w3m dump (www.youtube.com)
| kmstout wrote:
| "Is it even possible? It's the best kind of possible:
| theoretically possible."
| Nevermark wrote:
| But can you now run it in a homebrew "tiny sheet" running on a
| tiny sheet version of your 16-Bit CPU?
|
| I love that people do these ridiculous but inventive things.
| Never would have imagined this one.
| nayuki wrote:
| That reminds me of the metapixel in the Game of Life, which is
| a large structure that emulates one cell.
| https://conwaylife.com/wiki/OTCA_metapixel
| b33j0r wrote:
| I am super disappointed in the lack of evolution of dataflow, but
| am encouraged to see things like airtable, and I guess blender
| and the like using node-based interfaces for functional logic.
|
| I did my senior thesis/project in CS (we had to do several, it
| was anticlimactic) about visual programming, and basic paradigms
| that might be the future.
|
| I ended up writing a missive about labview holding people back,
| because 2D planes suck at communicating information to people who
| otherwise read books and blogs and C# code.
|
| My conclusion 15 years later is that we'll talk to LLMs and their
| successors rather than invent a great graphical user interface
| that works like a desktop or a <table> or even a repl.
|
| Star Trek may have inspired the ipad and terrible polygon
| capacitive touchscreens... but we all know that "Computer, search
| for M-class planets without fans of Nickelback's second album
| living there as of stardate 2024" is already basically a reality.
|
| EDIT: I like this CPU experiment too! It is a great example of
| the thing I'm talking about. Realized after the fact that I
| failed to plant my context in my comment, before doing my
| graybeard routine.
|
| So. Food for thought, our LLM overlords are just unfathomable
| spreadsheets.
| danbruc wrote:
| Graphical programming just does not work; it has been tried
| often enough. As soon as you step beyond toy examples, you need
| a hierarchical organization, functions calling functions
| calling functions. How do you represent that graphically? You
| put additional graphs side by side or allow some kind of drill
| down. Now all your graphs are pretty trivial and you have not
| gained much over a handful of lines of code, but you have
| reduced the density a lot with all the space between nodes and
| all the arrows.
|
| Natural language programming is not going to happen either,
| because natural languages are too ambiguous. You can probably
| write code iteratively in a natural language in some kind of
| dialog, clarifying things as ambiguities arise, but using that
| dialog as the source of truth and treating the resulting code
| as a derivative output does not sound very useful to me. So if
| I had to bet, I would bet that text-based programming languages
| are not going anywhere soon.
|
| Maybe one day there will be no code at all, everything will
| just contain small artificial brains doing the things we want
| them to do without anything we would recognize as a program
| today, but who knows; it does not seem worth speculating about
| to me.
|
| In the nearer term I could see domain-specific languages
| becoming more prevalent. A huge amount of the code we write is
| technical detail, because we have to express even the highest
| level business logic in terms of booleans, integers and
| strings. If we had a dozen different languages tailored to
| different aspects of an application, we could write a lot less
| code.
|
| We have this to a certain extent, a language for code in
| general, one for querying data, one for laying out the user
| interface, one for styling it. But they are badly integrated
| and not customizable. The problem is of course that developing
| good languages and evolving them is hard, and lowering them
| into some base language is tedious work. But in principle I
| could imagine that progress is possible on this front and that
| it becomes practical.
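|
| As a toy illustration of that last paragraph (a sketch only;
| the rule names and thresholds below are invented, and Python
| stands in for a real DSL toolchain), here is the same bit of
| business logic written once directly against booleans and
| strings and once against a tiny embedded rule language:
|
|     # Raw version: the rule is buried in booleans, ints and strings.
|     def eligible_raw(order):
|         return (order["total"] > 100
|                 and order["country"] == "DE"
|                 and not order["flagged"])
|
|     # Tiny embedded "rule language": named combinators instead of
|     # bare operators, so the domain vocabulary is explicit.
|     def field(name):
|         return lambda order: order[name]
|
|     def gt(get, limit):
|         return lambda order: get(order) > limit
|
|     def eq(get, value):
|         return lambda order: get(order) == value
|
|     def not_(rule):
|         return lambda order: not rule(order)
|
|     def all_of(*rules):
|         return lambda order: all(r(order) for r in rules)
|
|     eligible = all_of(
|         gt(field("total"), 100),
|         eq(field("country"), "DE"),
|         not_(field("flagged")),
|     )
|
|     order = {"total": 150, "country": "DE", "flagged": False}
|     print(eligible_raw(order), eligible(order))  # True True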
| f1shy wrote:
| >> you need a hierarchical organization, functions calling
| functions calling functions. How do you represent that
| graphically?
|
| Not saying graphical programming is a good idea, but the
| basic abstraction mechanism is to define new boxes, which you
| can look inside by opening them (like some VHDL modeling tools
| do). Even SICP says this is a bad idea and does not scale. But
| it is clear that the primitive is the box, the means of
| combination is the lines between boxes, and the means of
| abstraction is making new boxes.
|
| I think the real problem is that there is exactly one
| primitive, one means of abstraction and one means of
| combination, and that seems to not be enough.
| danbruc wrote:
| What else could you have? Whatever you are building, you
| will always have some primitives and ways of combining them
| and that's practically it. To make things more manageable,
| you start abstracting, give names to things and refer to
| them by name instead of by their structure. The next level
| up would probably be parameterization: instead of having a
| name for a thing, you have a name plus parameters for a
| family of things. Maybe before that you could get a bit
| more fancy with instantiation and allow things like
| repetition. But that again is pretty much it: make
| parameterized instantiation a bit more fancy and you will
| quickly create a Turing-complete meta layer capable of
| generating arbitrary constructs in the layer we started
| with.
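|
| Spelled out in code (Python purely for illustration; the gate
| names are invented), that ladder looks roughly like this:
|
|     # Primitive plus one means of combination: gates wired together.
|     def nand(a, b):
|         return not (a and b)
|
|     # Abstraction by naming: refer to a structure by name,
|     # not by its wiring.
|     def not_gate(a):
|         return nand(a, a)
|
|     def and_gate(a, b):
|         return not_gate(nand(a, b))
|
|     # Parameterization: one name plus a parameter for a whole
|     # family of things, built by repetition.
|     def and_chain(bits):
|         result = True
|         for b in bits:
|             result = and_gate(result, b)
|         return result
|
|     print(and_gate(True, False))           # False
|     print(and_chain([True, True, True]))   # True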
| pilgrim0 wrote:
| > If we had a dozen different languages tailored to different
| aspects of an application, we could write a lot less code.
|
| I think this has potential. As we all know, natural language
| is a weak tool for expressing logic. On the other hand,
| programming languages are limited by their feature set and
| paradigmatic alignment. But whatever code language we use to
| express a particular software product, the yield for the end
| user is virtually the same. I mean, how the logic is laid out
| and worked out has practically no effect on the perceived
| functionality, e.g. a button programmed to display an alert
| on the screen can be programmed in numerous languages but the
| effect is always the same. If however we had like drivers and
| APIs for everything we could possibly need in the course of
| designing a program, then we could just emit structured data
| to endpoints in a data flow fashion, such that the program is
| manifested as a managed activation pattern. In this scenario,
| different APIs could have different schemas, and those could
| effectively be synthesized through specialized syntax, hence
| nano-DSLs for each task. It would not be so different,
| conceptually, from the very same ISAs embedded in processors:
| each instruction has its own syntax and semantics, it's only
| very regular and simple. But for the scenario of pure
| composability to work at a high level, we would need to fully
| rework the ecosystem and platforms. I mean, in this context a
| single computer would need to work like a distributed system
| with homogeneous design and tightly integrated semantics for
| all its resident components.
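|
| A very rough sketch of the "emit structured data to endpoints"
| idea (all endpoint names and schemas below are invented, and
| Python stands in for whatever the real platform would be):
|
|     # Hypothetical endpoints, each with its own tiny schema.
|     ENDPOINTS = {
|         "ui.alert":  {"text": str},
|         "log.event": {"name": str, "value": int},
|     }
|
|     def emit(endpoint, payload):
|         # Check the payload against the endpoint's schema before
|         # "activating" it.
|         schema = ENDPOINTS[endpoint]
|         for key, expected in schema.items():
|             assert isinstance(payload[key], expected), (endpoint, key)
|         print(f"-> {endpoint}: {payload}")
|
|     # The "program" is then just a managed pattern of activations.
|     emit("ui.alert",  {"text": "Button pressed"})
|     emit("log.event", {"name": "clicks", "value": 1})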
| b33j0r wrote:
| I don't anticipate "natural language programming." I
| anticipate systems that maintain a declarative state of
| running services, based on requirements and
| iteration/adversarial chore-checking.
|
| You won't be using ChatGPT to write source code that you copy
| and paste, and debug.
|
| You'll be saying "no she was wearing a shorter dress, with
| flowers," like Geordi LaForge using the holodeck to solve
| mysteries.
|
| The boilerplate below won't even be necessary. Here's how I
| see it working though:
|
| "Hey, welcome to Earth. So here's the deal. You maintain my
| website that drop-ships pokemon cards using paypal merchant
| integration. You will have a team of AI designers who you
| will hire for specific skills by designing a plan with
| detailed job descriptions.
|
| I want one guy to just make funny comments in the PR history.
| Make it look like cyberpunk. Respect EU privacy laws by
| maintaining regional databases, and hire another agent who
| has a JD to follow similar regulatory requirements in the
| news.
|
| I hate oracle databases and j2ee, use anything else ;)"
| dukoid wrote:
| I am working on a side project related to dataflow and would
| like to get some input. Is there a simple way to get in
| contact?
| analog31 wrote:
| Does dataflow necessarily require a graphical interface? What
| defined my experience with LabVIEW was the sheer amount of
| manual labor required to write more than a trivial program,
| and the eyestrain headaches that came with it.
|
| I did a huge amount of Excel with elaborate VB macros. Thinking
| back, it strikes me as odd that a dataflow programming tool
| used a conventional language as its macro language.
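|
| For what it's worth, dataflow does not have to be graphical;
| the graph can be plain text. A minimal sketch (cell names
| invented), roughly in the spirit of spreadsheet cells
| recomputing from their inputs:
|
|     # Each "cell" is a function plus the names of its inputs,
|     # much like a spreadsheet formula referring to other cells.
|     GRAPH = {
|         "a":   (lambda: 2,           []),
|         "b":   (lambda: 3,           []),
|         "sum": (lambda a, b: a + b,  ["a", "b"]),
|         "out": (lambda s: s * 10,    ["sum"]),
|     }
|
|     def evaluate(name, cache=None):
|         cache = {} if cache is None else cache
|         if name not in cache:
|             func, deps = GRAPH[name]
|             cache[name] = func(*(evaluate(d, cache) for d in deps))
|         return cache[name]
|
|     print(evaluate("out"))  # 50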
| robblbobbl wrote:
| Impressive. Thanks for sharing!
| dkekenflxlf wrote:
| TBH, the oooonly question is:
|
| Can it run Crysis?
|
| :D
| nayuki wrote:
| I built AES and DES in Excel about a decade ago. Note that
| these are combinational circuits, not sequential, so no
| feedback or clock is required.
| https://www.nayuki.io/page/aes-cipher-internals-in-excel
| https://www.nayuki.io/page/des-cipher-internals-in-excel
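|
| For context on that distinction (a sketch, not taken from
| those spreadsheets, and the substitution table below is a toy,
| not the real AES S-box): in a combinational circuit every
| output is a pure function of the inputs, so a grid of formulas
| with no circular references is enough, whereas a CPU needs
| registers that feed back into themselves on a clock.
|
|     # Toy substitution table standing in for "a fixed lookup",
|     # which a spreadsheet can do with INDEX/LOOKUP.
|     TOY_SBOX = {0x0: 0x6, 0x1: 0x4, 0x2: 0xC, 0x3: 0x5}
|
|     def add_round_key(nibble, key_nibble):
|         # Pure function of its inputs: the XOR step is combinational.
|         return nibble ^ key_nibble
|
|     def sub_nibble(nibble):
|         # Pure lookup, again with no stored state and no clock.
|         return TOY_SBOX[nibble]
|
|     print(hex(sub_nibble(add_round_key(0x3, 0x1))))  # 0xc
|
|     # A CPU, by contrast, is sequential: a register's next value
|     # depends on its current value, so it needs feedback and a
|     # clock (in Excel terms, something like iterative
|     # calculation) to step forward.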
| linkdd wrote:
| The video is already 1 day old. Where is the DOOM port?
___________________________________________________________________
(page generated 2024-01-28 23:02 UTC)