Newsgroups: comp.sys.amiga.advocacy
Path: utzoo!utgpu!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!hobbes.physics.uiowa.edu!news.iastate.edu!ux1.cso.uiuc.edu!m.cs.uiuc.edu!storch
From: storch@m.cs.uiuc.edu (Matthew Storch)
Subject: The Amiga's Future
Message-ID: <1991Jun20.075145.22785@m.cs.uiuc.edu>
Summary: A polite flame on Commodore
Organization: University of Illinois, Dept. of Comp. Sci., Urbana, IL
Date: Thu, 20 Jun 1991 07:51:45 GMT

I just had my first look at this newsgroup tonight, and was surprised by the
extreme positions taken up on both sides of arguments concerning future Amiga
video technology.  A few people seemed to be demanding 24-bit graphics, while
others adamantly defended Commodore's minimal improvements in video technology
over the past six years.  I contend that both extremes are untenable positions.
Clearly, it is unreasonable for Commodore to make an Amiga with 24-bit graphics,
32-bit custom chips, additional graphics processors, etc. because as many have
accurately pointed out, the costs are too high.  Costs, of course, are
EVERYTHING in this business, because if you want a machine with the above
capabilities, NO PROBLEM:  go out and buy a Silicon Graphics workstation or
some such for $10,000 to $20,000 or more.  So anyone making demands for 
unreasonable hardware improvements is, well, unreasonable.

There is another side to the picture, however.  It has now been almost six
(count 'em) SIX years since the release of the original A1000, and there has
been almost no architectural improvement to the Amiga sound and graphics
systems.  In fairness to CBM, I must immediately point two things out,
however.  First, there HAVE been substantial improvements in the Amiga's
expansion bus; the A1000 had one 86-pin expansion connector, while the 2000
has:
	the CPU slot with some critical improvements for better coprocessor
		support
	the video slot, which has made all sorts of interesting peripherals
		possible, particularly the FlickerFixer
	an adequate supply of general-purpose expansion slots
	plus the IBM compatibility slots
The 3000 has at least one (I forget exactly how many) 32-bit slot to allow
for 68040 boards and other high-power boards.  And I almost forgot, the
beloved autoconfigure architecture!  The IBM guys are still fighting
collisions between various pieces of hardware over I/O ports, interrupt
numbers, I/O window addresses, and so on.  For all this, Dave Haynie and
company deserve a warm round of applause.

The other thing I need to point out is that the sound, although not 
improved since the A1000, was so good to start with that it's still quite
respectable in 1991.  It is a simple, beautiful, flexible DMA architecture
that does the job well.  It would be unreasonable to go crazy with heavy-
duty sound support; musicians have a better solution in MIDI-controlled
synthesizers.  So, perhaps the sound chips could be enhanced to play 16-bit
samples in addition to (for backward compatibility) the current 8-bit
samples.  Either way it's not that big a deal, since the current sound
is largely "good enough".
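
If anyone's curious, the downconversion is trivial.  Here's a minimal C
sketch (the function is mine, not anything from the OS includes) of how
16-bit source material could be squeezed into the signed 8-bit samples
the current sound DMA plays:

	#include <stddef.h>

	/* Hypothetical helper -- NOT from any Commodore include file.
	   Keep the high byte of each signed 16-bit sample to get the
	   signed 8-bit data the current audio DMA channels expect. */
	void pcm16_to_pcm8(const short *src, signed char *dst, size_t n)
	{
	    size_t i;
	    for (i = 0; i < n; i++)
	        dst[i] = (signed char)(src[i] >> 8);  /* drop low 8 bits */
	}

You throw away the low byte, and most ears barely notice -- which is
exactly why the 8-bit hardware is still "good enough".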

Graphics is another matter entirely.  Graphics improvements are VERY
necessary.  The sheer number of people arguing the issue is testimony
to that.  I can find virtually no excuse for CBM's minimal improvement
in the graphics hardware in six years.  In defense of CBM, and in
argument for some sort of device independent graphics system, Dave
Haynie argues that different classes of Amiga users (DTP people, video
people, games people, etc.) will want different things in the next
generation graphics system, and that it's impossible to please everyone.
With all due respect to Dave, that is a cheap excuse.  OF COURSE it's
impossible to please everyone, especially those who make unreasonable
demands for super-duper 24-bit graphics.  However, it IS reasonable
to expect SOME sort of nontrivial improvement in the architecture in
SIX years.  In those six years the IBM guys have gone from CGA as the
standard (far inferior to the Amiga chipset) to "Super VGA", which
is moderately SUPERIOR to the Amiga chipset, depending on how you
look at it.  I CAN say with very little room for argument that
the IBM versions of many games have comparable graphics to the Amiga
version, and some have BETTER graphics than the Amiga version.  I am
not asking CBM to produce miracles as some people do.  But it is far
past time for SOME SORT OF REASONABLE IMPROVEMENT.  How about
1024x768 in 2 bit planes?  If my trusty calculator is right, that would
be just 192K per screen, perfectly workable with even the
current 1 Meg of chip ram.  The ECS may address this point with
SuperHires mode, but
	1) it has been infinitely delayed and STILL isn't being sold
		to the general public
	2) Not having it, I can't be sure, but I get the idea from
		reading some net postings that it's not going to really
		work with the current deinterlacers (the FlickerFixer
		or Commodore's) because the deinterlacers aren't
		designed to sample more than 640 or so pixels on
		one line.  This means your choices are either 1280x200,
		(a hopelessly lopsided aspect ratio), or 1280x400 with
		standard interlace flicker, which I (and most other
		people) find either barely useable or totally unacceptable,
		depending on your exact type of monitor and what sort
		of image you're looking at.
I would be satisfied with 1280x400 DEINTERLACED support in as little as
two bit planes, and with some reasonable amount of memory bandwidth
available for the custom chips to operate on.
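
For anyone who wants to check the numbers, the planar arithmetic is one
line of C.  This throwaway fragment (mine, obviously) prints the sizes
of the modes I've been talking about, plus the two modes compared in the
next paragraph:

	#include <stdio.h>

	/* One bit per pixel per plane, eight bits to the byte. */
	static long screen_bytes(long w, long h, long planes)
	{
	    return w * h * planes / 8;
	}

	int main(void)
	{
	    printf("1024x768x2: %ld\n", screen_bytes(1024, 768, 2)); /* 196608 bytes = 192K   */
	    printf("1280x400x2: %ld\n", screen_bytes(1280, 400, 2)); /* 128000 bytes = 125K   */
	    printf(" 640x200x4: %ld\n", screen_bytes(640, 200, 4));  /*  64000 bytes = 62.5K  */
	    printf(" 320x200x4: %ld\n", screen_bytes(320, 200, 4));  /*  32000 bytes = 31.25K */
	    return 0;
	}

Note the last two lines:  640x200 in four planes is exactly twice the
memory of 320x200, which is the point of the next paragraph.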

How about 640x200 by four planes WITH
NO BLOCKING, so that most of the bandwidth of the blitter is not wasted
waiting for the display hardware?  With the current system, all games
with substantial animation have no realistic hope of running in anything
better than 320x200, but with the blocking removed, 640x200 only costs
twice as much as 320x200 in the same number of planes.  That would make
640x200 games workable with maybe even the current 7MHz CPU, and certainly
feasible with anything faster.

The memory blocking really is a major issue, and one that's not all that
hard to solve technically.  Even if a general 32-bit version of the
custom chips is too difficult for the guys at CBM, they can at least
double their display-access bandwidth by organizing the memory 32 bits
wide just for the purpose of latching 4 bytes at a time for the
display hardware; the display chip doesn't even need to be 32 bits wide,
just a miserable latch has to be 32 bits wide.  The two 16-bit quantities
could be fed one at a time into the "current" (current for developers
and maybe A3000 owners, that is) SuperHires display chip, and the strain
on chip ram bandwidth would be at least halved for the current video modes.
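
Back-of-the-envelope, and ignoring the fine details of the DMA slot
allocation (which I don't claim to know exactly):  each scan line of an
N-pixel, P-plane display needs N*P/16 word fetches from chip ram, so a
32-bit latch steals half as many bus slots:

	#include <stdio.h>

	int main(void)
	{
	    long width = 640, planes = 4;
	    long fetch16 = width / 16 * planes; /* 16-bit fetches per line: 160 */
	    long fetch32 = fetch16 / 2;         /* with a 32-bit latch:      80 */
	    printf("16-bit fetches/line: %ld\n", fetch16);
	    printf("32-bit latch /line:  %ld\n", fetch32);
	    return 0;
	}

Every fetch the display doesn't make is a bus slot the blitter or the
CPU gets back.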

The engineers at CBM probably thought of this at some point (am I right,
Dave?) but for whatever reason CBM just doesn't seem to want to improve
the basic graphics hardware.  Instead they have wasted engineering effort
on kluges like the (also infinitely delayed) A2024 "Hedley HiRes" monitor
and the A2091.  There is nothing wrong with the 2091, but there was almost
no reason for Commodore to waste precious engineering effort on it; by the
time it came out, third party manufacturers had comparable boards.  Why
waste time designing something GVP, etc. already had?  Note that I fully
support the original 2090; it came out with the 2000 and there was no
alternative at the time.  As for the 2024, I think it's RIDICULOUS
to invest in a proprietary display with no future, in light of the
excellent and reasonably priced range of COLOR multisync monitors
available.  (Ridiculous from both the users' and CBM's standpoints.)

Device independent graphics are fine, but it is still CRUCIAL to raise
the lowest common denominator, so ALL software companies, most notably
game manufacturers, will take advantage of the better hardware.  Besides,
there is more to device independent graphics than a new graphics.library.
What about the Copper?  Try to pull a 1280x400 screen down on the 
legendary A2024.  That's the price you pay for changing the video
architecture, and not just scaling it up.  Another argument against
having improvements only in device independent graphics:  what do you
do with the A500-type machines?
Yet another problem:  if the boards only work in 2000/3000 type machines,
they will be expensive, especially at first, since a relatively small
number of people will be buying them. 

Dave said that it will be some time before 68030s are cheap enough to
go in A500-class machines.  I agree, but how about a machine like this:
an A500 with a 68020 at 14.2xxx MHz, so as not to mess up the system
timing, if that's cheaper than an asynchronous design.  1 meg of 32-bit fast
memory for the CPU, and 1 meg of 16-bit chip ram with the above-mentioned
latch trick to alleviate display bandwidth problems.  How about it, anyone
at CBM?  Would it really be THAT expensive?  It can't be that hard for a
company that has already designed the A2620, A2630, and A3000.  A corresponding
2000 class machine could be made -- more or less an A2500 with the video
enhancements and the 68020 on the motherboard, as a first step toward
raising the lowest common denominator.  The 500-class machine should put
the Amiga back on the cover of Byte as the first low end machine with a
true 32-bit CPU (if not 32-bit custom chips -- oh well).

Sorry this has been so long, but it has been truly frustrating watching
Amiga graphics hardware go basically nowhere since 1985, while the IBM
clones have gone from CGA to $200 Super-VGA.  (Of course, their problem
is that they have no reasonable software standard whatsoever, so everyone
has to pick a few modes and support them the hard way.)

Finally, on a somewhat different note, WHERE is v2.0??????? In April 1990
at World of Commodore in New York CBM employees told me something like
"early fall" (of 1990!) for the A500/A2000 version.  A3000s have had v2.0
for what seems like forever.  Why not release it already?  It's going to
be impossible to get ALL the bugs out of something as complex as 2.0;
if it's stable enough for the A3000, then sell the damn thing already!
Isn't two years ENOUGH between OS updates?  Couldn't a few features have
been left out in the interests of getting it out the door a mere 3-4
months after the original release timeframe?  (I mean by the end of last
year or so.)  Oh, well, enough complaints already.

	Matt Storch (storch@cs.uiuc.edu)
