Newsgroups: news.software.b
Path: utzoo!henry
From: henry@utzoo.uucp (Henry Spencer)
Subject: Re: Why aren't new articles compressed?
Message-ID: <1989Oct27.161920.5169@utzoo.uucp>
Organization: U of Toronto Zoology
References: <431@cpsolv.UUCP>
Date: Fri, 27 Oct 89 16:19:20 GMT

In article <431@cpsolv.UUCP> rhg@cpsolv.uucp (Richard H. Gumpertz) writes:
>Why aren't news articles compressed (and decompressed when read or forwarded)?
>Does C news maybe compress them?  It seems like an effective halving of disk
>usage would easily pay for the cycles needed to compress/uncompress.

No, we don't compress them.  In general, we didn't change the way news is
stored; we thought about a whole bunch of possible schemes and concluded
that none of them had enough advantages to be worthwhile.

Compressing lots and lots of small files is very expensive and the degree
of compression is not that impressive.  Admittedly, the quantized allocation
of disk space tends to magnify the effect for such small files, but it's
still a lot of work for limited gain.  It means that an article has to be
decompressed every time it is read, batched for transmission to another
site, or processed in any other way.  The performance impact, on a busy
machine, would be horrendous.  Our perception was that shortening expiry
times is generally a more cost-effective way of economizing on disk space.
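To make the block arithmetic concrete, here is a latter-day sketch in
Python, with zlib standing in for the compress(1) of the era.  The sample
article and the 1 KB allocation unit are illustrative assumptions, not
measurements from a real news spool (and the repeated text compresses
unrealistically well -- the point is the block counts, not the byte ratio):

```python
# Sketch: what compression buys on one small article under
# block-quantized disk allocation.  Assumptions: a ~1.2 KB article
# (the quoted question, repeated) and 1 KB filesystem blocks.
import zlib

article = (
    "Newsgroups: news.software.b\n"
    "From: rhg@cpsolv.uucp (Richard H. Gumpertz)\n"
    "Subject: Why aren't new articles compressed?\n"
    "\n"
    "Why aren't news articles compressed (and decompressed when read or\n"
    "forwarded)?  It seems like an effective halving of disk usage would\n"
    "easily pay for the cycles needed to compress/uncompress.\n"
).encode() * 4

packed = zlib.compress(article, 9)

BLOCK = 1024  # assumed filesystem allocation unit

def blocks(nbytes):
    """Disk blocks actually consumed under quantized allocation."""
    return -(-nbytes // BLOCK)  # ceiling division

print("bytes:  %4d raw, %4d compressed" % (len(article), len(packed)))
print("blocks: %4d raw, %4d compressed"
      % (blocks(len(article)), blocks(len(packed))))
```

An article that straddles a block boundary can drop from two blocks to
one, which is the magnification mentioned above; an article that already
fits in one block saves nothing at all.  Either way, every read, batch,
or other pass over the article would pay the decompression cost again.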

There is also a pragmatic issue: compressed storage would mean modifying
*all* the news readers.  There are lots of those, many more than you'd think.
-- 
A bit of tolerance is worth a  |     Henry Spencer at U of Toronto Zoology
megabyte of flaming.           | uunet!attcan!utzoo!henry henry@zoo.toronto.edu
