[HN Gopher] The Soul of an Old Machine: Revisiting the Timeless ...
___________________________________________________________________
The Soul of an Old Machine: Revisiting the Timeless von Neumann
Architecture
Author : todsacerdoti
Score : 127 points
Date : 2024-11-12 04:31 UTC (18 hours ago)
(HTM) web link (ankush.dev)
(TXT) w3m dump (ankush.dev)
| 082349872349872 wrote:
| the paper:
| https://www.ias.edu/sites/default/files/library/Prelim_Disc_...
|
| > _6.8.5 ... There should be some means by which the computer can
| signal to the operator when a computation has been concluded.
| Hence an order is needed ... to flash a light or ring a bell._
|
| (later operators would discover that an AM radio placed near
| their CPU would also provide an audible indication of process
| status)
| jagged-chisel wrote:
| I have experienced two or three computers in my life (at least
| one laptop) that produced a barely audible sound when CPU
| activity changed. The most memorable was a PowerBook G4 with a
| touchpad, and as I slid my finger slowly across the pad in a
| quiet enough room, I could hear kind of a tick each time the
| pointer moved a pixel.
| simne wrote:
| This was very common with the first sound cards (at least on
| PCs). As I remember, it was only with the Creative Sound
| Blaster that the era of continuously hearing "machine soul"
| sounds finally ended.
| dantondwa wrote:
| Still happens to me when using software that is GPU
| intensive, like Blender. When I drag a slider, I hear the
| buzzing of the GPU.
| Arainach wrote:
| That's not what's being described in this thread. You're
| referring to fan noise; the other comments are discussing
| electrical interference.
| camtarn wrote:
| Not necessarily. My GPU audibly sings in the kHz range
| whenever it comes off idle, a long time before the fans
| actually start up. It could be electrical noise from the
| fan drivers and/or motor coils if they're running at low
| speed, but it's definitely not the sound of air movement.
| And if you're e.g. processing photos on the GPU, you can
| hear it starting and stopping, exactly synced to the
| progress of each photo in the UI.
| magicalhippo wrote:
| Similarly, on-board soundcards are notorious for this. Even in
| my relatively recent laptop I can judge activity by the noise
| in my headphones if I use the built-in soundcard, thanks to
| electrical interference. On one motherboard I had, I could
| monitor CPU activity fairly accurately this way.
|
| There's also audible noise that can sometimes be heard from
| singing capacitors[1] and coil whine[2], as mentioned in a
| sibling comment.
|
| [1]:
| https://product.tdk.com/system/files/contents/faq/capacitors...
|
| [2]:
| https://en.wikipedia.org/wiki/Electromagnetically_induced_ac...
| mycall wrote:
| I jokingly refer to my GPU's coil whine as an HDD emulator, as
| it sounds like a spinning disk to me.
| RGamma wrote:
| Depending on workload my mainboard's VRMs (I think) sound
| like birds chirping.
| magicalhippo wrote:
| The voltage regulators are really just buck
| converters[1], one per phase, and so contain both
| capacitors and inductors in a highly dynamic setting.
|
| Thus they're very prone to the effects mentioned.
|
| [1]: https://en.wikipedia.org/wiki/Buck_converter
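|
| To make the steady-state relation concrete, here's a minimal
| Python sketch of the ideal buck equation (the numbers are only
| illustrative, not from any particular board):
|
|     # Ideal buck converter in continuous conduction:
|     # Vout = D * Vin, where D is the switching duty cycle.
|     # Under a rapidly changing load the drive waveform changes
|     # too (e.g. pulse skipping at light load), which can excite
|     # the inductors and capacitors at audible rates.
|     def buck_vout(vin: float, duty: float) -> float:
|         assert 0.0 <= duty <= 1.0
|         return duty * vin
|
|     print(buck_vout(12.0, 0.1))  # 12 V rail -> ~1.2 V core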
| dapperdrake wrote:
| There was a Lenovo docking station compatible with the T420
| (around 2012). The headphone jack had a shrieky noise on it
| whenever the Gigabit Ethernet connection got going. Took a
| while to figure that one out.
|
| The other two docking stations were built differently and had
| the wires run somewhere else.
|
| EDIT: Typo
| jazzyjackson wrote:
| I have a Dell laptop that makes a high-pitched whine whenever
| I'm scrolling. Really strange. GeForce RTX 2060.
| yencabulator wrote:
| The connection to scrolling is usually increased activity
| on the bus. CPU feeding the graphics card new bitmaps is
| just one source of that traffic. In a previous job, I could
| tell whether a 3-machine networking benchmark was still
| running based on the sound.
| ysleepy wrote:
| Some computers ship with an implementation of this using
| inductor coils which whine depending on the load :D
| wglb wrote:
| I remember doing this back on the IBM 1130. We didn't see that
| there was any value in using it as a monitor.
| philiplu wrote:
| Much of my earliest computer experience was on an 1130 clone,
| the General Automation 18/30. Never did the AM radio thing,
| but you could see what phase the Fortran compiler was up to
| just by the general look of the blinkenlights.
| jjkaczor wrote:
| My favourite was troubleshooting an already ancient original
| IBM XT in 1992... The operator was complaining about the screen
| giving her headaches.
|
| Sure enough - when I went onsite to assist, that "green" CRT
| was so incredibly wavy, I could not understand how she could do
| her job at all. The first thing I tried was moving it away from
| the wall, in case I had to unplug it to replace it.
|
| It stopped shimmering and shifting immediately. Moved it back -
| it started again.
|
| That's when we realised that her desk was against a wall hiding
| the main electrical feeds to the regional Bell Canada switching
| data-centre on the floors above her.
|
| I asked politely if she could have her desk moved - but no...
| that was not an option...
|
| ... So - I built my first (but not last) solution using some
| tin-foil and a cardboard box that covered the back and sides of
| her monitor - allowing for airflow...
|
| It was ugly - but it worked, and we never heard from her again.
|
| My 2nd favourite was with GSM mobile devices - and my car radio
| - inevitably immediately before getting a call, or TXT message
| (or email on my Blackberry), if I was in my car and had the
| radio going, I would get a little "dit-dit-dit" preamble and
| know that something was incoming...
|
| (hahahaha - I read this and realise that this is the new "old
| man story", and see what the next 20-30 years of my life will
| be, boring the younglings with ancient tales of tech
| uselessness...)
| glhaynes wrote:
| _My 2nd favourite was with GSM mobile devices - and my car
| radio - inevitably immediately before getting a call, or TXT
| message (or email on my Blackberry), if I was in my car and
| had the radio going, I would get a little "dit-dit-dit"
| preamble and know that something was incoming..._
|
| I remember this happening all the time in meetings. Every few
| minutes, the conversation would stop because a phone call was
| coming in and all the conference room phones started buzzing.
| One of those things that just fades away so you don't notice
| it coming to an end.
| mrandish wrote:
| At a certain large tech company, at the peak of GSM data-
| capable phones (e.g. Palm Treo, Blackberry), it was accepted
| practice for everyone to turn off data or leave their phone on
| a side table, because the conference speakerphones on the
| tables would amplify the data bursts.
|
| Also, during this era I was on a flight and the pilot came
| over the PA right before pushing from the gate saying
| exasperatedly "Please turn your damn phones off!" (I
| assumed the same RF noise was leaking into his then-
| unshielded headset).
| MBCook wrote:
| The "GSM chirp". I never got to hear it much because my
| family happened to use CDMA phones in that era, but I do
| remember hearing it a few times. I know it was well known.
|
| I haven't thought about that in a long time.
| mikewarot wrote:
| I had an IBM in an electrical generating station that had a
| HUGE 480-volt 3-phase feed not far from the monitor, in the
| next room. The display would swirl about half an inch at quite
| a clip.
|
| The solution I picked was to put the machine in text/graphics
| mode (instead of the normal character-ROM text mode; this was
| back in the MS-DOS days), so the vertical sync rate then
| matched up with the swirling EM fields, and there was almost
| zero apparent motion.
| vincent-manis wrote:
| Many years ago, I was a user of an IBM 1130 computer system. It
| filled a small room, with (as I recall) 16K 16-bit words and a
| 5MB disk drive, which was quite noisy. I would feed in my
| Fortran program, then head down to the other end of the
| building to get coffee. The computer would think about the
| program for a while, and then start writing the object file.
| This was noisy enough that I'd hear it, and head back to the
| computer room just in time to see the printer burst into
| action.
|
| (Please feel free to reference the Monty Python "Four
| Yorkshiremen" sketch. But this really happened.)
| simne wrote:
| > 10^5 flip-flops is about 12.2KiB of memory
|
| That's a reasonable amount for a CISC CPU cache, since it's
| normal for CISC to have a small number of registers (RISC is
| definitely a many-register design, as memory operations are
| expensive under the RISC paradigm).
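|
| (A quick sanity check on that figure, assuming one bit per
| flip-flop:)
|
|     bits = 10**5             # one bit per flip-flop
|     print(bits / 8 / 1024)   # 12.20703125, i.e. ~12.2 KiB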
| Someone wrote:
| I don't think that's correct. Ideally, your working set fits
| into the cache, and that doesn't get smaller or larger when you
| use a RISC CPU.
|
| (There may be an effect because of instruction density, but
| that's not very big, and will have less of an effect the larger
| your working set)
| simne wrote:
| > Ideally, your working set fits into the cache
|
| Ideally, your working set fits into the registers, and that
| was the biggest winning feature of RISC: having relatively
| simple hardware (compared to genuine CISC, like the IBM 360
| mainframes), RISC designs saved significant chip area for more
| registers.
|
| Register-register operations definitely avoid the memory wall,
| so the data cache becomes less significant.
|
| Sure, as long as we compare comparable things, not apples vs.
| carrots.
| simne wrote:
| If you are unlucky and your working set doesn't fit into the
| registers, then you will certainly have to deal with the memory
| bus and the cache hierarchy. But it's best if you don't touch
| memory at all, doing all the processing in your code in
| registers.
| mikewarot wrote:
| If you've read the article, and thought to yourself... _I wonder
| what it was like back then, and if there might be some other
| direction it could have gone_, oh boy do I have a story (and
| opportunity) for you!
|
| It's my birthday today (61)... I'm up early to get a tooth
| pulled, and I read this wonderful story, and all I have is _a
| tangent I hope some of you think is interesting._ It would be
| nice if someone who isn't brain-foggy could run with the idea
| and make the billion dollars or so I think can be wrung out of
| it. You get a bowl of spaghetti that _could_ contain the key to
| petaflops, secure computing, and a new universal solvent of
| computing like the Turing machine, as an instructional model.
|
| The BitGrid is an FPGA without all the fancy routing hardware,
| that being replaced with a grid of latches... in fact, the
| whole chip would consist almost entirely of D flip-flops and
| 16:1 multiplexers. (I lack the funds to do a TinyTapeout, but
| I've started down that road should the money appear.)
|
| All the computation happens in cells that are 4-bit-in,
| 4-bit-out look-up tables. (Mostly so signals can cross without
| XOR tricks, etc.) For the times you need a chunk of RAM, the
| closest I've got is using one of the tables as a 16-bit-wide
| shift register, which I've decided to call isolinear memory[6].
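|
| If you want the gist without reading the emulator source,
| here's a toy sketch of a single cell in Python -- four
| 16-entry truth tables, one per output bit, indexed by the 4
| input bits. (The real emulators linked below also latch the
| outputs and clock the grid in phases; this is just the
| combinational core.)
|
|     from typing import List, Sequence, Tuple
|
|     def cell(tables: List[Sequence[int]],
|              inputs: Tuple[int, int, int, int]) -> Tuple[int, ...]:
|         """4-in/4-out LUT cell: one 16-entry table per output bit."""
|         idx = (inputs[0] | (inputs[1] << 1)
|                | (inputs[2] << 2) | (inputs[3] << 3))
|         return tuple(t[idx] for t in tables)
|
|     # Output 0 is the XOR of all four inputs; outputs 1-3 just
|     # pass inputs through, which is how signals cross a cell.
|     xor_all = [bin(i).count("1") & 1 for i in range(16)]
|     pass_a = [(i >> 0) & 1 for i in range(16)]
|     pass_b = [(i >> 1) & 1 for i in range(16)]
|     pass_c = [(i >> 2) & 1 for i in range(16)]
|     print(cell([xor_all, pass_a, pass_b, pass_c], (1, 0, 1, 1)))
|     # -> (1, 1, 0, 1)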
|
| You can try the online React emulator I'm still working on [1],
| and see the source[2]. Or the older one I wrote in Pascal, that's
| MUCH faster[3]. There's a writeup from someone else about it as
| an Esoteric Language[4]. I've even written a blog about it over
| the years.[5] It's all out in public, for decades up to the
| present... so it should be easy to contest any patents, and keep
| it fun.
|
| I'm sorry it's all a big incoherent mess... completely replacing
| an architecture is _freaking hard_, especially when confronted
| with 79 years of optimization in another direction. I do think
| it's possible to target it with LLVM, if you're feeling frisky.
|
| [1] https://mikewarot.github.io/bitgrid-react-app/
|
| [2] https://github.com/mikewarot/bitgrid-react-app
|
| [3] https://github.com/mikewarot/Bitgrid
|
| [4] https://esolangs.org/wiki/Bitgrid
|
| [5] https://bitgrid.blogspot.com/
|
| [6] https://bitgrid.blogspot.com/2024/09/bitgrid-and-
| isolinear-m...
| NetOpWibby wrote:
| Happy birthday!
|
| I have no idea what you're talking about but I'm gonna follow
| these links.
| gradschool wrote:
| I don't like to rain on anyone's parade, but this is at least
| the third time I've seen one of these comments, and if it were
| me I'd want someone to point out a drawback that these young
| whippersnappers might be too respectful to mention. Isn't it a
| truism in modern computer architecture that computation is
| cheap whereas communication is expensive? That "fancy routing
| hardware" is what mitigates this problem as far as possible by
| enabling signals to propagate between units in the same clock
| domain within a single clock period. Your system makes the
| problem worse by requiring a number of clock periods at best
| directly proportional to their physical separation, and worse
| depending on the layout. If I were trying to answer this
| criticism, I'd start by looking up systolic arrays (another
| great idea that never went anywhere) and finding out what
| applications were suited to them if any.
| kije wrote:
| Not sure why you're saying that systolic arrays never went
| anywhere. They're widely used for matrix operations in many
| linear algebra accelerators and tensor units (yes, largely in
| research), but they are literally the core of Google's TPU
| [1] and AWS EC2 Inf1 instances [2].
|
| [1] https://cloud.google.com/blog/products/ai-machine-
| learning/a...
|
| [2] https://aws.amazon.com/blogs/aws/amazon-ecs-now-supports-
| ec2...
| glitchc wrote:
| Sorry to burst your bubble but modern FPGAs are already
| designed this way (latches paired with muxes and LUTs across
| the lattice). Take a look at the specs for a Xilinx Spartan
| variant. You still need the routing bits because clock skew and
| propagation delay are real problems in large circuits.
| mikewarot wrote:
| Yes, I know about clock skew and FPGAs. The routing resources
| are an optimization for latency, which can be quite substantial
| and can make or break applications.
|
| The goal of an FPGA is to get a signal through the chip in as
| few nanoseconds as possible. It would be insane to rip out that
| hardware in that case.
|
| However... I care about throughput, and latency usually
| doesn't matter much. In many cases, you could tolerate
| startup delays in the milliseconds. Because everything is
| latched, clock skew isn't really an issue. All of the lines
| carrying data are short, VERY short, and thus lower
| capacitance. I believe it'll be possible to bring the clock
| rates up into the gigahertz range. I think it'll be possible
| to push 500 MHz on the Skywater 130 process that Tiny Tapeout
| uses. (Not outside of the chip, of course, but internally.)
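|
| Back-of-envelope, that's where the throughput claim comes
| from (the grid size here is hypothetical, just to show the
| scaling):
|
|     # Every cell evaluates its 4-in/4-out LUT on every clock,
|     # so aggregate throughput scales as cells * clock rate.
|     cells = 1000 * 1000   # a hypothetical 1000x1000 grid
|     clock = 500e6         # 500 MHz internal clock
|     print(f"{cells * clock:.1e} cell-ops/s")  # 5.0e+14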
|
| [edit/append]
|
| Once you've got a working BitGrid chip rated for a given
| clock rate, it doesn't matter how you program it: because all
| logic is latched, the clock skew will always be within
| bounds. With an FPGA, you've got to run simulations, adjust
| timings, etc., and wait for things to compile and optimize to
| fit the chip; this can sometimes take a day.
| glitchc wrote:
| Latency is only part of the problem. Unmanaged clock skew
| leads to race conditions, dramatically impacting circuit
| behaviour and reliability. That's the reason the industry
| overwhelmingly uses synchronous circuit designs, with clocks
| triggering transitions.
| simne wrote:
| The goal of an FPGA is to get a cheap prototype, and/or to
| serve applications whose volumes aren't large enough for an
| ASIC to become cost-effective.
|
| Low-latency links are just a byproduct, added to improve FPGA
| performance (which is definitely poor compared to an ASIC);
| you don't have to use them in your design.
|
| If you make a working FPGA prototype, it is not hard to
| convert it to an ASIC.
|
| Happy birthday!
| simne wrote:
| BTW, have you heard about asynchronous FPGA programming (and
| asynchronous HDL design)? It was a very popular subject some
| years ago.
| WillAdams wrote:
| For those who don't recognize the title, this is a reference to
| the classic:
|
| https://www.goodreads.com/book/show/7090.The_Soul_of_a_New_M...
|
| which was one of the first computer books I ever read --- I
| believe in an abbreviated form in _Reader's Digest_ or in a
| condensed version published by them (can anyone confirm that?)
|
| EDIT: or, maybe I got a copy from a book club --- if not that,
| must have gotten it from the local college after prevailing upon
| a parent to drive me 26 miles to a nearby town....
| bayouborne wrote:
| "I am going to a commune in Vermont and will [In my mind I've
| always heard a 'henceforth' inserted here for some reason] deal
| with no unit of time shorter than a season"
|
| One of my favorite quotes in the book - when an overworked
| engineer resigns from his job at DG. The engineer, coming off a
| death march, leaves behind a note on his terminal as his letter
| of resignation. The incident occurs during a period when the
| microcode and logic were glitching at the nanosecond level.
| rjsw wrote:
| He didn't join a commune, though; he's still working [1].
|
| [1] http://www.polybus.com/hdlmaker/Resume.html
| cmrdporcupine wrote:
| Wow, I love that web page / resume and that era of HTML
| authorship. Brings back memories.
| ghaff wrote:
| I don't know about an abridged version but it's one of the best
| books about product development ever written. I actually
| dotted-lined to Tom West at one point though a fair bit after
| the events of "the book." (Showstopper--about Windows NT--is
| the other book I'd recommend from a _fairly_ similar era from
| the perspective of today.)
| ConfiYeti wrote:
| Thanks for sharing this. I forgot to add the link in the
| original post.
|
| I also highly recommend the TV show Halt and Catch Fire. It's
| not related to the book but very similar spiritually.
| ghaff wrote:
| While far from perfect, Halt and Catch Fire definitely
| captured a lot of the spirit of the early PC industry at
| about the same time (early 80s).
| ngai_aku wrote:
| The Halt and Catch Fire Syllabus[0] has a lot of awesome
| content worth checking out as well.
|
| [0] https://bits.ashleyblewer.com/halt-and-catch-fire-
| syllabus/
| Angostura wrote:
| The full length version is a _really_ good read
| bayouborne wrote:
| Indeed, one of the more memorable set pieces in chapter 1:
|
| "He traveled to a city, which was located, he would only say,
| somewhere in America. He walked into a building, just as
| though he belonged there, went down a hallway, and let
| himself quietly into a windowless room. The floor was torn
| up; a sort of trench filled with fat power cables traversed
| it. Along the far wall, at the end of the trench, stood a
| brand-new example of DEC's VAX, enclosed in several large
| cabinets that vaguely resembled refrigerators. But to West's
| surprise, one of the cabinets stood open and a man with tools
| was standing in front of it. A technician from DEC, still
| installing the machine, West figured.
|
| Although West's purposes were not illegal, they were sly, and
| he had no intention of embarrassing the friend who had given
| him permission to visit this room. If the technician had
| asked West to identify himself, West would not have lied, and
| he wouldn't have answered the question either. But the moment
| went by. The technician didn't inquire. West stood around and
| watched him work, and in a little while, the technician
| packed up his tools and left.
|
| Then West closed the door, went back across the room to the
| computer, which was now all but fully assembled, and began to
| take it apart.
|
| The cabinet he opened contained the VAX's Central Processing
| Unit, known as the CPU--the heart of the physical machine. In
| the VAX, twenty-seven printed-circuit boards, arranged like
| books on a shelf, made up this thing of things. West spent
| most of the rest of the morning pulling out boards; he'd
| examine each one, then put it back.
|
| ..He examined the outside of the VAX's chips--some had
| numbers on them that were like familiar names to him--and he
| counted the various types and the quantities of each. Later
| on, he looked at other pieces of the machine. He identified
| them generally too. He did more counting. And when he was all
| done, he added everything together and decided that it
| probably cost $22,500 to manufacture the essential hardware
| that comprised a VAX (which DEC was selling for somewhat more
| than $100,000). He left the machine exactly as he had found
| it."
| HarHarVeryFunny wrote:
| That reminds me of the US, during the Cold War, intercepting
| the Soviet "Lunik" satellite, in transit by truck while it was
| being exhibited in the US(!), and overnight completely
| disassembling/reassembling it before letting it go on its way
| with the Soviets none the wiser.
| throw0101a wrote:
| > _" Lunik" satellite_
|
| * https://www.cia.gov/readingroom/collection/lunik-loan-
| space-...
|
| * https://greydynamics.com/the-lunik-plot-how-the-cia-
| hijacked...
| shon wrote:
| Such a great book... should be required reading for anyone
| managing engineers.
| Robotenomics wrote:
| There is a "Where are they now" post in Wired magazine about
| the team 20 years later: https://www.wired.com/2000/12/eagleteam/
| favorited wrote:
| I'm about 75% through the audiobook, and it's absolutely
| fantastic.
|
| The most surprising thing so far is how advanced the hardware
| was. I wasn't expecting to hear about pipelining, branch
| prediction, SIMD, microcode, instruction and data caches, etc.
| in the context of an early-80s minicomputer.
| adamc wrote:
| It's also the best non-fiction book I've ever read. And won the
| Pulitzer, I think.
| drcwpl wrote:
| This is a great read. von Neumann was pivotal in the design of
| the architecture, and I think his contribution is way under-
| appreciated. Did you know he also wrote a terrific book, The
| Computer and the Brain, and coined the term "the Singularity"?
| https://onepercentrule.substack.com/p/the-architect-of-tomor...
| defphysics wrote:
| I see this in Wikipedia:
|
| "The attribution of the invention of the architecture to von
| Neumann is controversial, not least because Eckert and Mauchly
| had done a lot of the required design work and claim to have
| had the idea for stored programs long before discussing the
| ideas with von Neumann and Herman Goldstine[3]"
|
| https://en.wikipedia.org/wiki/Von_Neumann_architecture
| jazzyjackson wrote:
| Yes, von Neumann was tasked with doing a write-up of what
| Eckert and Mauchly were up to in the course of their contract
| building the ENIAC/EDVAC; it was meant to be an internal memo.
| Goldstine leaked the memo, and the ideas inside were
| attributed to the author, von Neumann. This prevented any of
| the work being patented, btw, since the memo served as prior
| art.
|
| The events are covered in great detail in Jean Jennings
| Bartik's autobiography "Pioneer Programmer". According to her,
| von Neumann really wasn't that instrumental to this particular
| project, nor did he mean to take credit for things -- it was
| others, big fans of his, who hyped up his accomplishments.
|
| I attended a lecture by Mauchly's son, Bill, "History of the
| ENIAC", in which he explains how ENIAC was a dataflow computer
| that, when it was just switches and patch cables, could do
| operations in parallel. There's a D.H. Lehmer quote: "This was
| a highly parallel machine, before von Neumann spoiled it."
| https://youtu.be/EcWsNdyl264
| mannykannot wrote:
| I like the idea of identifying 'bit flips' in papers, which are
| (if I am following along) statements which precipitate or
| acknowledge a paradigm shift.
|
| Perhaps the most important bit-flip of this paper's time (and
| perhaps first fully realized in it) might be summarized as
| 'instructions are data.'
|
| This got me thinking: today, we are going through a bit-flip that
| might be seen as a follow-on to the above: after von Neumann,
| programs were seen to be data, but different from problem/input
| data, in that the result/output depends on the latter, but only
| through channels explicitly set out by the programmer in the
| program.
|
| This is still true with machine learning, but to conclude that an
| LLM is just another program would miss something significant, I
| think - it is training, not programming, that is responsible for
| their significant features and capabilities. A computer
| programmed with an untrained LLM is more closely analogous to an
| unprogrammed von Neumann computer than it is to one running any
| program from the 20th century (to pick a conservative tipping
| point).
|
| One could argue that, with things like microcode and virtual
| machines, this has been going on for a long time, but again, I
| personally feel that this view is missing something important -
| but only time will tell, just as with the von Neumann paper.
|
| This view puts a spin on the quote from Leslie Lamport in the
| prologue: maybe the future of a significant part of computing
| _will_ be more like biology than logic?
| Pannoniae wrote:
| And the next paradigm shift _after_ that will probably be
| "programs' own outputs are their input"
| cmontella wrote:
| That sounds like a feedback control loop. That's the basis of
| a programming language I'm writing ^_^
| mk_stjames wrote:
| This is essentially the paradigm that Karpathy deemed
| "Software 2.0" in an article in 2017:
|
| https://karpathy.medium.com/software-2-0-a64152b37c35
| tromp wrote:
| > We should not accept this. If we don't, then the future of
| computing will belong to biology, not logic.
|
| In this quote from Leslie Lamport, I took "If we don't" to mean
| "If we don't accept this". But the rest of the sentence made no
| sense then.
|
| Could it be a really awkward way to say: If we don't "not accept
| this", i.e. if we accept this?
| MBCook wrote:
| I think you're right, it means something like "if we don't
| refuse to accept this".
|
| That is rather awkward isn't it.
|
| I really liked that whole block quote though.
| yencabulator wrote:
| Maybe an earlier draft said "We should reject this. If we don't
| [...]", and was edited to be less harsh.
| ctenb wrote:
| I just realized that the word for the organ the musical
| instrument and the organ the body part is one and the same in
| English. (In Dutch they're called "orgel" and "orgaan",
| respectively.) Which of these meanings is being referred to in
| the article? To me both could make sense.
| tomesco wrote:
| The definition of organ in this article is closest to the body
| part definition. The use of organ here relies on a less common
| definition: roughly a constituent part of some larger whole
| that performs a specific function.
| MBCook wrote:
| I think that's a great summary.
|
| As a native English speaker, it was understandable but feels
| rather foreign because you never hear parts of computers
| referred to that way these days.
| bloody_bocker wrote:
| > When people who can't think logically design large systems,
| those systems become incomprehensible. And we start thinking of
| them as biological systems. And since biological systems are too
| complex to understand, it seems perfectly natural that computer
| programs should be too complex to understand.
|
| For some time I've been drawing parallels between ML/AI and how
| biology "solves problems" - evolution. And I am also a bit
| disappointed by the fact that the future might lead us in a
| direction away from the mathematical elegance of solving
| problems.
| l33tbro wrote:
| Are you able to share what these parallels are that you've
| drawn? I've always thought them to be slightly different
| processes.
| MBCook wrote:
| I really loved that quote in the article. It's such an
| excellent description of the "computer voodoo" users come up
| with to explain what is, to them, unexplainable about
| computers.
|
| You're right though, we're basically there with AI/ML, aren't
| we? I mean, I guess we know why it does the things it does in
| general, but the specific "reasoning" on any single question is
| pretty much unanswerable.
___________________________________________________________________
(page generated 2024-11-12 23:00 UTC)