2005.09.23 21:11 "[Tiff] Additional Lossless Compression Schemes", by Frank Warmerdam

2005.09.25 04:12 "Re: [Tiff] Additional Lossless Compression Schemes", by Joris Van Damme

looking into a PCD-type scheme. Support for large images is getting more important as the years go by. We've currently got a real problem when

It may seem counter-intuitive, but I think that image file growth may peak in the next few years. The reason for this theory is that CPUs continually grow faster and image pixel counts keep increasing, but disk I/O performance has not improved nearly as spectacularly. So the obvious solution is to apply compression. Compression is becoming "cheaper" CPU-wise as the years go by, and it will be faster and more efficient to use a compressed file than an uncompressed one.
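
(For illustration, a rough back-of-envelope of the disk-versus-CPU tradeoff described above; every number below is an assumption chosen for the sketch, not a measurement.)

```python
# Back-of-envelope: when does reading compressed data win?
# All figures are illustrative assumptions, not benchmarks.

disk_mb_per_s = 50.0      # assumed sustained disk read throughput
decomp_mb_per_s = 200.0   # assumed decompression throughput (output MB/s)
image_mb = 600.0          # assumed uncompressed image size
ratio = 3.0               # assumed lossless compression ratio

uncompressed_read_s = image_mb / disk_mb_per_s
compressed_read_s = (image_mb / ratio) / disk_mb_per_s + image_mb / decomp_mb_per_s

print(f"uncompressed: {uncompressed_read_s:.1f} s")  # 12.0 s
print(f"compressed:   {compressed_read_s:.1f} s")    # 4.0 s read + 3.0 s decode = 7.0 s
```

As CPUs keep getting faster the decompression term shrinks while the disk term lags behind, so the gap in favour of the compressed file only widens.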

It's always dangerous to try and predict in this business, but yes, I would say you're making a safe bet; that doesn't seem counter-intuitive at all to me.

But you leave me unsure about where your comment is coming from. Did I manage to express myself badly and confuse my point, yet again? If I said anything that makes you think I oppose good compression, then I was unclear, because of course I don't. In fact, the proposed scheme of a 'reasonably sized' base image (compressed, of course) and bigger deltas (also compressed, and compressing better than complete bigger images because of their delta nature) is partly about good compression of large images, while retaining the benefits of tiled pyramid storage and adding the benefit that single-chunk mainstream readers will typically access only the 'reasonably sized' base image and thus not f*k up entirely.
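
(A minimal sketch of the base-plus-delta idea in Python rather than libtiff code, assuming 8-bit grayscale data and simple nearest-neighbour prediction; the function names, downsampling factor, and synthetic test image are illustrative only, not part of any proposal text.)

```python
import numpy as np
import zlib

def upsample(base, factor, shape):
    """Nearest-neighbour upsample 'base' back to 'shape'."""
    up = np.kron(base, np.ones((factor, factor), dtype=base.dtype))
    return up[:shape[0], :shape[1]]

def split(full, factor=4):
    """Return (base, delta): a downsampled base image plus a mod-256
    residual against the base upsampled back to full resolution."""
    base = full[::factor, ::factor]
    delta = full - upsample(base, factor, full.shape)  # uint8 wrap-around, reversible
    return base, delta

def merge(base, delta, factor=4):
    """Reconstruct the full-resolution image from base + delta."""
    return upsample(base, factor, delta.shape) + delta  # uint8 wrap-around

# Synthetic smooth 8-bit test image (stand-in for real pixel data).
y, x = np.mgrid[0:512, 0:512]
full = ((np.sin(x / 37.0) + np.cos(y / 23.0)) * 60 + 128).astype(np.uint8)

base, delta = split(full)
assert np.array_equal(merge(base, delta), full)

# For this smooth test image the delta hovers around zero, so a generic
# codec packs it much tighter than the full-resolution pixels, while the
# small base image remains readable on its own by a single-chunk reader.
print("full  :", len(zlib.compress(full.tobytes())))
print("base  :", len(zlib.compress(base.tobytes())))
print("delta :", len(zlib.compress(delta.tobytes())))
```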

Joris Van Damme
info@awaresystems.be
http://www.awaresystems.be/
Download your free TIFF tag viewer for Windows here:
http://www.awaresystems.be/imaging/tiff/astifftagviewer.html