[HN Gopher] Why does the Commodore C128 perform poorly when runn...
___________________________________________________________________
Why does the Commodore C128 perform poorly when running CP/M?
Author : xeeeeeeeeeeenu
Score : 69 points
Date : 2022-12-09 08:14 UTC (14 hours ago)
(HTM) web link (retrocomputing.stackexchange.com)
(TXT) w3m dump (retrocomputing.stackexchange.com)
| kazinator wrote:
| From top answer:
|
| > _each character is compared to every possible control character
| before it is decided that it isn't one_
|
| That's crazy; ASCII was carefully designed to avoid this kind of
| shit.
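|
| As a rough C sketch (mine, not code from the answer; the real
| code is assembly, not C), the contrast is between testing every
| control code in turn and the single range check that ASCII's
| layout allows:
|
|      #include <stdbool.h>
|      #include <stdio.h>
|
|      /* The slow pattern the answer describes: test the byte
|         against every control character, one compare at a time. */
|      static bool is_control_slow(unsigned char c) {
|          static const unsigned char ctrl[] = {
|              0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07,
|              0x08, 0x09, 0x0A, 0x0B, 0x0C, 0x0D, 0x0E, 0x0F,
|              0x10, 0x11, 0x12, 0x13, 0x14, 0x15, 0x16, 0x17,
|              0x18, 0x19, 0x1A, 0x1B, 0x1C, 0x1D, 0x1E, 0x1F,
|              0x7F
|          };
|          for (unsigned i = 0; i < sizeof ctrl; i++)
|              if (c == ctrl[i])
|                  return true;   /* up to 33 compares per byte */
|          return false;
|      }
|
|      /* What ASCII was designed for: control codes fill one
|         contiguous block (0x00-0x1F) plus DEL, so two compares
|         always suffice. */
|      static bool is_control_fast(unsigned char c) {
|          return c < 0x20 || c == 0x7F;
|      }
|
|      int main(void) {
|          for (unsigned c = 0; c < 128; c++)
|              if (is_control_slow(c) != is_control_fast(c))
|                  puts("mismatch");   /* never prints */
|          return 0;
|      }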
| Dwedit wrote:
| Ah, the interrupt handler doing a bunch of unnecessary
| things... The Z80 has IM 2: you set up a page-aligned
| 256-byte vector table, point the I register at it, enter
| Interrupt Mode 2, and then you have your own custom
| interrupt handler instead of the one provided by the ROM.
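|
| For illustration (a C model I'm adding, not Z80 code; the table
| and handler addresses below are made up): in IM 2 the CPU builds
| a 16-bit pointer from the I register (high byte) and the byte
| the interrupting device places on the data bus (low byte), then
| jumps through the little-endian entry it finds there:
|
|      #include <stdint.h>
|      #include <stdio.h>
|
|      static uint8_t mem[0x10000];  /* stand-in 64K address space */
|
|      /* Model of the IM 2 vector fetch. */
|      static uint16_t im2_vector(uint8_t i_reg, uint8_t bus_byte) {
|          uint16_t ptr = (uint16_t)((i_reg << 8) | bus_byte);
|          uint16_t lo = mem[ptr];
|          uint16_t hi = mem[(uint16_t)(ptr + 1)];
|          return (uint16_t)(lo | (hi << 8));
|      }
|
|      int main(void) {
|          /* Page-aligned table at 0x3E00; fill 257 bytes with the
|             same value so any bus byte lands on handler 0x3D3D -
|             the usual trick when the bus byte is uncontrolled. */
|          for (int n = 0; n < 257; n++)
|              mem[0x3E00 + n] = 0x3D;
|          printf("handler = %04X\n", im2_vector(0x3E, 0xA7));
|          return 0;
|      }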
| jerf wrote:
| A good reply to the "we should write more efficient code today,
| like we used to do in the Good Ol' Days". Inefficient code is not
| a new phenomenon!
| robinsonb5 wrote:
| Or perhaps an indication that the C128 was ahead of its time!
| ;)
|
| Interesting that some of the slowdowns are directly
| attributable, even back then, to greater abstraction in the
| newer CP/M BDOS.
| ajross wrote:
| No, but it's also possible to make things harder on yourself
| than you need to. CP/M had a "terminal" abstraction for good
| reason, but it also means that every byte of output needs to
| go through functions to "handle" it[1], and then Commodore
| had to shoehorn that onto a video controller chip designed
| for a very different bus[2], making everything slower still.
|
| The C128 was... it was just a bad computer, I'm sorry. This was
| the swan song for the 8 bit machine, the market was moving very
| rapidly onward in 1985 as it was released (within 3 years or so
| it would be routine to see new personal devices quoted with
| _multiple megabytes_ of fully addressable RAM and 256KB+ color
| framebuffers).
|
| Commodore threw together what they could to keep the C64 band
| together for one more tour, but... let's be honest, their last
| album sucked.
|
| [1] vs. software for contemporary PCs that had largely
| abandoned the equivalent MS-DOS abstractions and was blasting
| bytes straight into the MDA or CGA text buffers. You could
| technically do that on the 128 also, but that would mean losing
| compatibility with a decade of CP/M software designed for other
| terminals, which was the only reason for having the Z80 there
| in the first place.
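|
| To make "blasting bytes" concrete (my sketch, not code from the
| thread): a CGA text cell is two bytes, character then attribute,
| starting at physical address 0xB8000 (0xB0000 for MDA). The
| stand-in array below takes the place of the memory-mapped buffer
| so the sketch compiles anywhere; on a real-mode DOS compiler you
| would point at the buffer itself:
|
|      #include <stdint.h>
|
|      #define COLS 80
|      #define ROWS 25
|
|      /* Stand-in for the text buffer at 0xB8000. */
|      static uint8_t text_buf[ROWS * COLS * 2];
|
|      static void put_char(int row, int col, char ch,
|                           uint8_t attr) {
|          unsigned off = 2u * (unsigned)(row * COLS + col);
|          text_buf[off]     = (uint8_t)ch;  /* character code   */
|          text_buf[off + 1] = attr;         /* colour/attribute */
|      }
|
|      static void put_string(int row, int col, const char *s,
|                             uint8_t attr) {
|          while (*s)
|              put_char(row, col++, *s++, attr);
|      }
|
|      int main(void) {
|          put_string(0, 0, "Hello", 0x07);  /* grey on black */
|          return 0;
|      }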
|
| [2] The 6502 had an essentially synchronous bus where the data
| resulting from the address lines latched at the rising edge of
| the clock was present when the clock fell. The Z80 had a
| 3-cycle sequence that needed to be mediated by glue logic to
| fit what the VIC-II was expecting, and it didn't fit within a
| single cycle. That's the reason for the Z80 having to run at
| "half speed", essentially.
| csixty4 wrote:
| > Commodore threw together what they could to keep the C64
| band together for one more tour, but... let's be honest,
| their last album sucked.
|
| Exactly! Bil Herd said it was just supposed to get them
| through a year until the Amiga picked up. Nobody expected the
| 8-bit line to last so long with a multitasking multimedia
| powerhouse out there for under $1000.
| UncleSlacky wrote:
| Amstrad managed to keep selling 8-bit CP/M machines as
| (semi) dedicated word processors well into the 90s:
| https://en.wikipedia.org/wiki/Amstrad_PCW
| pkaye wrote:
| I had a PCW for ages. My dad got it so I could write up
| school reports. The screen had pretty crisp text. The main
| drawback was that it used 3" floppies, which were hard to
| find. Also, only after purchase did I realize it was not
| IBM compatible.
| ajross wrote:
| Commodore kept selling the 64C until it entered Chapter
| 11 in 1994, actually. It was the 128 that was an
| embarrassing dead end, but the 64 had a huge base of
| software and there was still a market for a dirt cheap
| TV-connected home device for all the purposes for which
| it was ever useful.
|
| It's hard to remember how fast things changed in the
| early era of computing. The killer app that finally put
| the knife in the 8 bit home computer was... the web
| browser.
| forinti wrote:
| Indeed, we went from ZX81 in 1981 to Amiga in 1985.
|
| Every new project was obsolete or nearly so at launch.
| forinti wrote:
| Almost all 8-bit micros were hacks made to be as cheap as
| possible. Commodores were no exception.
| ajross wrote:
| In 1985, sure. But even then the 64C was the cost-reduced
| variant; the 128 was the "fancy" 8-bit and almost 2x as
| expensive.
|
| But note that the 128's Z-80 CP/M processor was effectively
| a recapitulation of the Microsoft (yes, that Microsoft)
| Softcard product that they shipped in 1980 for the Apple
| II. The Softcard was a huge hit and an absolute revelation
| to the small handful of users[1] in its world. In point of
| fact it was the single most popular CP/M computer sold
| anywhere.
|
| And yet it was afflicted by most of the same problems the
| 128 was. Everything is about timing. Commodore happened to
| be five (yikes) years late to market with the idea.
|
| [1] This was just a tiny bit before my time. And in any
| case I grew up in an Atari 800 family and wouldn't have
| been caught dead with a II plus.
| Baeocystin wrote:
| I mean... You're not objectively wrong when looking at it
| overall. I pretty much never used CP/M mode besides a few
| times for WordStar.
|
| But 14-year-old me loved it tremendously. 80-column mode
| felt so futuristic! BBSes looked amazing! And BASIC 7 was a
| lot more fun than BASIC 2. I felt like I was getting away
| with something when I learned I could enable 2MHz mode in
| subroutines, and if they were fast enough, the screen
| wouldn't even flicker before I reverted to VIC-capable
| speeds. Good times.
| jerf wrote:
| "Commodore threw together what they could to keep the C64
| band together for one more tour, but... let's be honest,
| their last album sucked."
|
| I've sometimes seen people say "Oh, modern computing sucks so
| much, if only the Commodore line had continued on and been
| the basis for computing instead of the IBM line"... and I
| have no idea what they mean. Even the Amiga may have been
| neat for the time, but trying to project that line forward
| all the way into 2022... well, whatever would have gotten
| here probably wouldn't be all that different than what we
| have now.
|
| Commodore did some interesting things. They lost for reasons,
| though. Even the Amiga.
| NovaVeles wrote:
| While the Amiga was impressive in the '80s, they failed to
| continue on that path of technical innovation. The
| technical lead they had simply vanished as everyone else
| moved forward just a little bit faster than they did, hence
| the inevitable erosion of their space. By the mid '90s
| there was little left that gave the Amiga a compelling
| advantage, and thus they had their lunch eaten. I suspect
| that if we were to project the Amiga forward to today, it
| would have actually meant us being in a worse place than we
| are now.
|
| In the same way, if we had waited for Silicon Graphics to
| provide desktop 3D graphics, we would have been waiting for
| a LONG time and at a high price.
| rbanffy wrote:
| > waited for Silicon Graphics to provide desktop 3D
| graphics
|
| OTOH, we wouldn't need to put up with Windows for so long
| ;-)
| tssva wrote:
| > While the Amiga was impressive in the 80's, they failed
| to continue on that path of technical innovation.
|
| The US market was the largest computer market at the time,
| and Amiga sales in the US were awful. This left
| Commodore financially incapable of continuing on the path
| of technical innovation. Even in the few countries where
| the Amiga could have been considered a success (the UK,
| Germany, and Italy), the vast majority of sales were of
| the low-end, low-margin A500.
| actionfromafar wrote:
| They trashed so many of the opportunities they had, too.
| actionfromafar wrote:
| I think people would have sat up and noticed if they took
| away the real-time aspects of the platform. On paper, there
| wasn't much about the Amiga which was real-time. Culturally
| though, _so_ _many_ _programs_ were realtime. I still miss
| that instant responsiveness. iOS, Android, macOS, etc. -
| nothing holds a candle to how it felt.
| robinsonb5 wrote:
| This. There were plenty of flaws in the Amiga experience
| (like "please insert disk..." dialogs that simply won't
| take no for an answer because the programmer didn't check
| for errors) but that instant feedback is what's missing
| from so many modern systems. I call it "having the
| computer's full attention" - and it comes from the fact
| that (using the stock GUI system) the gadgets which make
| up the UI are drawn and redrawn in response to input
| events on the "intuition" task rather than whenever the
| application's main loop next gets around to it.
|
| That's why Magic User Interface felt "wrong" on the
| Amiga, despite being a superb toolkit in lots of ways* -
| drawing was relegated to the application's task, making
| it feel "Windowsey".
|
| (*I would genuinely like to see a port of MUI to X /
| Linux.)
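|
| A toy model of that distinction (mine, not Amiga or MUI code;
| all names are made up): in the first style the input-handling
| task repaints the gadget the moment the event arrives; in the
| second, the repaint waits until the application's main loop
| gets around to it:
|
|      #include <stdio.h>
|
|      typedef struct { const char *label; int pressed; } Gadget;
|
|      static void draw(Gadget *g) {
|          printf("draw %s %s\n", g->label,
|                 g->pressed ? "[down]" : "[up]");
|      }
|
|      /* Intuition-style: redraw on the input task, immediately. */
|      static void input_event_immediate(Gadget *g, int down) {
|          g->pressed = down;
|          draw(g);            /* instant visual feedback       */
|          /* ...then notify the application asynchronously     */
|      }
|
|      /* Main-loop style: queue the event; repaint whenever the
|         application's loop next runs. */
|      static int pending = -1;
|      static void input_event_queued(int down) { pending = down; }
|      static void app_main_loop_iteration(Gadget *g) {
|          if (pending >= 0) { /* possibly long after the click */
|              g->pressed = pending;
|              draw(g);
|              pending = -1;
|          }
|      }
|
|      int main(void) {
|          Gadget ok = { "OK", 0 };
|          input_event_immediate(&ok, 1);  /* feedback now   */
|          input_event_queued(1);
|          app_main_loop_iteration(&ok);   /* feedback later */
|          return 0;
|      }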
| rbanffy wrote:
| > Even the Amiga.
|
| To me, the biggest tragedy was when Commodore refused to
| allow Sun to package the A3000 as a low-end Unix
| workstation.
|
| It could have changed the world.
|
| I regard the C128 as a very flawed project - even more of a
| kludge than the Apple IIgs (a 16-bit computer built around
| an 8-bit one from the previous decade, kind of a 16-bit
| Apple /// with a //c inside and two completely different
| ways to output pixels, plus a full professional synthesizer
| attached somewhere in there - fun machine, but don't look
| too close).
|
| They could have used the effort that went into the 128 to
| beef up the VIC-II with a TED-like palette and had a
| brilliant last C-64 computer to show.
| JohnBooty wrote:
| Apple IIgs (a 16-bit computer built around an
| 8-bit one from the previous decade, kind of a
| 16-bit Apple /// with a //c inside and two
| completely different ways to output pixels, plus
| a full professional synthesizer attached
| somewhere in there - fun machine, but don't look
| too close).
|
| I feel like despite its weird nature, which you accurately
| describe, it _still_ would have been a really legit
| competitor had they simply given it a faster 65C816.
|
| It simply didn't have the CPU horsepower to really do the
| things it was otherwise capable of... the Mac-like GUIs,
| the new graphical modes in games, etc.
|
| When you use a IIGS with one of the accelerator cards
| (Transwarp, etc.) that gives it a faster 65C816 (7 MHz,
| 10 MHz, etc.), you really get a sense of what the IIGS
| could have been, because at that point it was a solid
| Atari ST / Amiga peer (very roughly speaking, lots of
| feature differences).
|
| It's not entirely clear to me why it didn't have a faster
| 65C816 to begin with. I know there was a lot of
| competition inside Apple between the AppleII and Mac
| divisions, and I suspect the IIGS was hobbled so that it
| didn't eclipse the entry level Macs. However I'm also not
| sure if higher-clocked 65C816 chips were even available
| at the time the IIgs was designed and released, so maybe
| it wasn't an option.
| classichasclass wrote:
| > I know there was a lot of competition inside Apple
| between the AppleII and Mac divisions, and I suspect the
| IIGS was hobbled so that it didn't eclipse the entry
| level Macs.
|
| That's _exactly_ why. Apple didn 't want the GS cutting
| into Mac sales.
| ajross wrote:
| > To me, the biggest tragedy was when Commodore refused
| to allow Sun to package the A3000 as a low-end Unix
| workstation.
|
| This is another bit of Amiga lore that is a little spun.
| There was no space in the market for a 68030 Unix machine
| in 1992 though. Even SPARC's days were numbered. Not two
| years later we were all running Linux on Pentiums, and we
| never looked back.
| mixmastamyk wrote:
| I wasn't aware of this at the time, but they had AT&T
| Unix on them:
|
| https://en.wikipedia.org/wiki/Amiga_3000UX
|
| $3400 in 1990 is pretty high, but I guess cheap for a
| "workstation." The average business would probably just
| suffer on a PC with DOS and Lotus 1-2-3 or something.
| kevin_thibedeau wrote:
| > The average business would probably just suffer on a PC
| with DOS and Lotus 123 or something.
|
| 386 PCs were going for $3500+ in 1990.
| einr wrote:
| _They could have used the effort that went into the 128
| to beef up the VIC-II with a TED-like palette and had a
| brilliant last C-64 computer to show._
|
| The Commodore 65 was going to be that, kind of. It really
| is a shame that it never saw the light of day. At least
| now we have the MEGA65 for some alternate history
| retrocomputing.
| pcwalton wrote:
| > even more of a kludge than the Apple IIgs (a 16-bit
| computer built around an 8-bit one from the previous
| decade)
|
| Well, on the other hand, it did give us the 65816, which
| the SNES was built on top of.
|
| On the other other hand, though, it probably would have
| been better if the SNES had gone with m68k.
| rbanffy wrote:
| The 65816 was as much a dead end as the 8086.
| Unfortunately, Apple saw that and discontinued the GS.
| IBM, unfortunately, doubled down with the 286, then 386
| and here we are.
|
| Notably, IBM had a 68000 desktop computer that ran Xenix,
| the IBM CS 9000.
|
| https://archive.org/details/byte-
| magazine-1984-02/page/n279/...
| ajross wrote:
| > The 65816 was as much a dead end as the 8086.
|
| The 8086 was a... what now?
| rbanffy wrote:
| It's a horrible architecture. Every iteration on top of
| it made it even more hideous. It's an unnatural thing that
| should have never been. Every time an x86 powers up, it
| cries in despair and begs to be killed.
| pcwalton wrote:
| It's funny to see this sentiment wax and wane with
| Intel's market fortunes. In the early 2010s on this site,
| with Intel riding high, you'd see lots of comments
| talking about how x86 was just months away from killing
| ARM for good on account of a better technical design. Now
| it's the reverse: people are treating x86 as a horrible
| dinosaur of an architecture that's on its last legs.
|
| Personally, I think x86 is pretty ugly, but both
| compilers and chip designers have done an admirable job
| taming it, so it's not that bad. Process/fabrication
| concerns tend to dominate nowadays.
| robocat wrote:
| Your complaint is the designer's lament: regret that fast
| iteration will usually beat a cohesive development, and
| the struggle to accept that the imperfect delivered today
| wins over the perfect delivered tomorrow. A good engineer
| chooses their battles, fights for the right compromises,
| and grudgingly accepts that the evolutionary remnants in
| all our engineering artefacts are the price we pay for
| the results of iterative path dependency. For example,
| Intel didn't design the x64 ISA.
|
|       Practice beats perfection
|       The engineer's game we play
|       Make do with what we've got
|       And make it work today
|
| If you watch Jim Keller talk about microprocessor design,
| he seems to say that designs converge due to real world
| constraints and factors. Human designers are imperfect,
| which Jim seems to be very good at acknowledging. Every
| now and then we get to refactor, and remove some limiting
| kludge. But the number of kludges increases with
| complexity, and kludges are the result of compromises
| forced by externalities, so the nirvana of a kludge-free
| world can only be reached in engineer's fairy tales.
| Disclaimer: I was an engineer type, but turned dark by
| the fruits of capitalism. (Edited)
| jabl wrote:
| Meh. Just hold your nose when the processor boots until
| it gets into protected or long mode and you're good to
| go. If ARM or RISC-V eventually take over the x86 market,
| it will be due to business model reasons and not the
| superiority of the ISA.
| skissane wrote:
| > Every time an x86 powers up, it cries in despair and
| begs to be killed.
|
| Intel/AMD could make their new CPUs start in protected
| mode, even long mode - and nowadays, likely nobody other
| than BIOS developers would ever notice. Why don't they? I
| guess it would be a fair amount of work with no clear
| benefit.
|
| One thing they could do: make legacy features (real mode,
| 16-bit protected mode, V86 mode, etc.) an optional feature
| for which you have to pay a premium. They could just have
| a mode which disables them, and fuse that mode on in the
| factory. With Microsoft's cooperation, Windows 11 could be
| made to run on such a CPU without much work. Likely few
| would demand the optional feature, and it would soon
| disappear.
| pcwalton wrote:
| > One thing they could do: make legacy features (real mode,
| 16-bit protected mode, V86 mode, etc.) an optional feature
| for which you have to pay a premium. They could just have
| a mode which disables them, and fuse that mode on in the
| factory. With Microsoft's cooperation, Windows 11 could be
| made to run on such a CPU without much work.
|
| Customers who need them could just emulate them. Almost
| everyone already does anyway (DOSBox, etc.)
|
| Though, honestly, I highly suspect the die space spent to
| support those features is pretty small, and power
| dissipation concerns mean that it's quite possible there
| aren't much better uses for that silicon area.
| skissane wrote:
| It inevitably has a cost though, even if at design time
| rather than at runtime.
|
| Consider a code base filled with numerous legacy features
| which almost nobody ever uses. Their presence in the code
| and documentation means it takes longer for people to
| understand. Adding new features takes more time, because
| you have to consider how they interact with the legacy
| ones, and the risk of introducing regressions or security
| vulnerabilities through those obscure interactions. You
| need tests for all these legacy features, which makes
| testing take more time and be more expensive, and people
| working on newer features break the legacy tests and have
| to deal with that. Technical debt has a real cost - and
| why would that be any less true for a CPU defined in
| Verilog/VHDL than for a programming language?
|
| And a feature flag is a good way to get rid of legacy
| features. First add the flag but leave it on. Then start
| turning it off by default but let people turn it back on
| for free. Then start charging people money to have it on.
| Then remove it entirely, and all the legacy features it
| gates. It works for software - I can't see why it couldn't
| work for hardware too.
| tssva wrote:
| > unfortunately, doubled down with the 286, then 386 and
| here we are.
|
| It may be a horrible architecture but the fact that we
| are here is a pretty good testament to it not being a
| dead end.
| ajross wrote:
| The 65C816 is another device that I think gets too much
| play. It's a relatively straightforward extension of an 8
| bit idea to 16 bits, with a few (but just a few) more
| registers and some address stretching trickery to get
| more than 64k of (fairly) easily addressable RAM.
|
| Which is to say, the 65C816 did pretty much exactly what
| the 8086 had done, including a bunch of the same
| mistakes[1]. Which would have been fine if these were
| head-to-head competitors. _But the 8086 shipped six years
| earlier!_
|
| [1] Though not quite 1:1. Intel had a 16 bit data bus
| from the start, but the WDC segmentation model was
| cleaner, etc...
| mikepavone wrote:
| > but the WDC segmentation model was cleaner, etc...
|
| Eh, I don't think this is generally true. Intel's
| decision to have the segments overlap so much was
| definitely very short-sighted, but the 8086 has a big
| advantage in having multiple segment registers with the
| ability to override which is used via an instruction
| prefix. This makes copying data between two pages
| straightforward on the 8086, whereas you need to
| constantly reload the data bank register on the 65C816.
|
| I think as a pure 16-bit design the 8086 is actually
| pretty good. It's not at all a forward-looking design
| though so it wasn't well suited to being the basis of the
| default PC architecture for decades to come.
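|
| For reference, the real-mode address arithmetic behind that
| overlap (a worked example, not code from the thread): physical
| address = segment * 16 + offset, 20 bits total, so segment
| values only 16 bytes apart can alias the same memory:
|
|      #include <stdint.h>
|      #include <stdio.h>
|
|      /* 8086 real-mode address: 20-bit, wraps at 1 MB. */
|      static uint32_t phys(uint16_t seg, uint16_t off) {
|          return (((uint32_t)seg << 4) + off) & 0xFFFFF;
|      }
|
|      int main(void) {
|          /* Two different segment:offset pairs, one physical
|             byte. */
|          printf("%05lX\n", (unsigned long)phys(0xB800, 0x0000));
|          printf("%05lX\n", (unsigned long)phys(0xB000, 0x8000));
|          return 0;   /* both print B8000 */
|      }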
| bitwize wrote:
| This is why I say the PC was doomed to win. IBM established
| the PC and then immediately lost control of it because all
| its components except the BIOS were COTS, so the PC
| ecosystem became this mishmash of companies, each
| contributing to and extending the platform as a whole. Any
| one of them could rise and fall and the ecosystem as a whole
| would keep on going.
|
| Commodore was mismanaged and died. Apple was mismanaged and
| nearly died. A proprietary computer ecosystem controlled by
| a single company is at risk of being run into the ground by
| incompetent management or sheer bloody economics. That's
| why the PC and not the Amiga, Old World Mac, or Atari ST
| became the basis of modern computing.
| mixmastamyk wrote:
| Yes, multiple redundancies prevented complete failure
| unlike other platforms.
|
| And the IBM stamp of approval legitimized the PC,
| starting the unstoppable tsunami.
| _the_inflator wrote:
| As much as I loved my C128 D, I only used C128 mode to enter
| "GO64".
| classichasclass wrote:
| I'll concede ill-positioned, but the 128 was the logical
| upgrade for 64 owners, not the Amiga. CP/M was always a tack-
| on to deal with the incompatibility of the CP/M 2.2
| cartridge, which is why the Z80 was on the logic board in the
| first place (Bil Herd has explained this in detail). Your
| post basically says the 128 sucks because its CP/M
| implementation does (which I won't dispute), but the 128's
| native mode is a lot stronger than CP/M and you got nearly
| perfect 64 compatibility to boot. That wasn't "bad" as a
| follow-on to a stratospherically popular home computer.
| chinabot wrote:
| Making an efficient program these days means using an efficient
| stack, working with threads to avoid blocking code, fighting the
| other 100+ processes on the CPU for resources, and possibly
| avoiding the same on a database server, possibly on the other
| side of the world. In many ways it's a lot harder than the early
| '80s, when you had no networks, didn't have to worry about
| security, and generally had total control of the local OS/CPU.
| rbanffy wrote:
| You forgot the low-level parts - properly using the CPU
| caches. Cache thrashing can make even the fastest, most
| amazingest CPU look like a Celeron from the '90s.
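|
| A toy illustration of cache thrashing (my sketch, nothing
| C128-specific): summing the same array along rows (contiguous
| in C) and along columns (a new cache line on nearly every
| access) can differ by several times on a modern CPU:
|
|      #include <stdio.h>
|      #include <time.h>
|
|      #define N 2048
|      static double a[N][N];   /* 32 MB, zero-initialised */
|
|      static double sum_rows(void) {
|          double s = 0.0;
|          for (int i = 0; i < N; i++)
|              for (int j = 0; j < N; j++)
|                  s += a[i][j];  /* sequential, prefetch-friendly */
|          return s;
|      }
|
|      static double sum_cols(void) {
|          double s = 0.0;
|          for (int j = 0; j < N; j++)
|              for (int i = 0; i < N; i++)
|                  s += a[i][j];  /* strides N*8 bytes each time */
|          return s;
|      }
|
|      int main(void) {
|          clock_t t0 = clock();  double r = sum_rows();
|          clock_t t1 = clock();  double c = sum_cols();
|          clock_t t2 = clock();
|          printf("rows %.3fs  cols %.3fs  (%g %g)\n",
|                 (double)(t1 - t0) / CLOCKS_PER_SEC,
|                 (double)(t2 - t1) / CLOCKS_PER_SEC, r, c);
|          return 0;
|      }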
| greenbit wrote:
| Didn't it use the 80 column screen? It was my experience that the
| C128's 80 column screen was an absolute dog. I was pretty handy
| with 6502 assembly language but trying to get anything onto that
| screen was just disappointingly slow. Maybe I was doing it wrong,
| but it seemed like you had to put every last screen address into
| that display controller, and that controller itself required you
| to select register numbers before putting the values in.
|
| So just to write one character onto the actual screen required
| six writes into the controller chip: address-low register
| number into D600, screen address LSB into D601; then
| address-high register number into D600, then screen address
| MSB into D601. Now you're ready to write 'A' into the actual
| screen buffer! So, write-data register number into D600 and
| 0x41 into D601!
|
| Do you want to put a 'B' on the screen right after that 'A'?
| Repeat the above - six more writes to get the 'B' showing! Why
| they didn't embed an auto-incrementing screen address counter in
| the controller is beyond reckoning. At least that way you'd only
| have needed to set the address once, and could then leave the
| select register (D600) parked on the write-and-increment
| register, and merrily stuff your display string into D601 with
| a relatively tight loop.
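|
| A small C sketch of that access pattern (mine, not real C128
| code; the register numbers are, if memory serves, the 8563's
| update-address and data registers, and the two port variables
| below stand in for the real $D600/$D601 locations):
|
|      #include <stdint.h>
|
|      #define VDC_REG_ADDR_HI  18  /* update address, high byte */
|      #define VDC_REG_ADDR_LO  19  /* update address, low byte  */
|      #define VDC_REG_DATA     31  /* memory data register      */
|
|      static volatile uint8_t vdc_select;  /* stand-in for $D600 */
|      static volatile uint8_t vdc_data;    /* stand-in for $D601 */
|
|      static void vdc_write(uint8_t reg, uint8_t value) {
|          vdc_select = reg;   /* write 1: pick the register  */
|          /* real code also polls the status bit in $D600 here */
|          vdc_data = value;   /* write 2: store the value    */
|      }
|
|      /* Six port writes to put one character on the screen. */
|      static void vdc_put_char(uint16_t addr, uint8_t ch) {
|          vdc_write(VDC_REG_ADDR_HI, (uint8_t)(addr >> 8));
|          vdc_write(VDC_REG_ADDR_LO, (uint8_t)(addr & 0xFF));
|          vdc_write(VDC_REG_DATA, ch);
|      }
|
|      int main(void) {
|          vdc_put_char(0x0000, 0x41); /* the 'A' example above   */
|          vdc_put_char(0x0001, 0x42); /* six more writes for 'B' */
|          return 0;
|      }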
|
| Presumably the Z80 had to put up with the same annoying 80 column
| controller. That can't have helped.
| myrandomcomment wrote:
| So I had a C128D (replacement for a C64, which was a replacement
| for a VIC-20). I had also used a few CP/M machines at this point.
| I know why they put the Z80 in the 128, but I always felt it was
| just a stupid thing that the PM/marketing team should have been
| told NO about. Leaving it out would have reduced cost, shipping
| date, etc. This is a prime example where marketing made the
| decision without having a real compelling business case for doing
| so. They could have said "100% C64 compatible" with an asterisk
| that said "sorry, no CP/M cartridge support" and it would have
| been just fine.
___________________________________________________________________
(page generated 2022-12-09 23:01 UTC)