Subj : Re: data compression on data blocks
To   : comp.programming
From : Joe Butler
Date : Mon Aug 29 2005 04:21 pm

I'm not an expert, but the answer probably depends a lot on the nature of
the data. For example, if the data is truly random, then I don't think it
will compress at all. If the data is lots of little line drawings, then
run-length encoding might do it (a rough sketch is below the quoted
message).

There are also trade-offs to consider, such as what's more important: fast
compression or fast decompression; the smallest theoretical size, or a size
rounded to the granularity of a hard-disk block; etc.

So, if you give this information, someone else will probably be able to
answer better.

"David" wrote in message news:dev4o4$vpu$1@news.hispeed.ch...
> I have to implement data compression to compress
> a lot of different small data blocks (about 100 to 1000 bytes)
> independent of each other. What are good algorithms to do this?
> Compression must be lossless.
> Thanks in advance
> David
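
Just to illustrate the run-length idea, here is a rough, untested sketch of
byte-wise RLE in C. The function name and the (count, value) pair format
are my own choices for the example, not any standard:

    #include <stddef.h>

    /* Encode src[0..src_len) as (count, value) byte pairs.
       dst must be able to hold up to 2 * src_len bytes in the
       worst case.  Returns the number of bytes written to dst. */
    size_t rle_encode(const unsigned char *src, size_t src_len,
                      unsigned char *dst)
    {
        size_t out = 0;
        size_t i = 0;

        while (i < src_len) {
            unsigned char value = src[i];
            size_t run = 1;

            /* Count how many times this byte repeats (capped at
               255 so the count fits in one byte). */
            while (i + run < src_len && src[i + run] == value
                   && run < 255)
                run++;

            dst[out++] = (unsigned char)run;
            dst[out++] = value;
            i += run;
        }
        return out;
    }

Note that on data with few runs this actually doubles the size, which is
the same point as above about truly random data not compressing.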