1993.08.17 00:06 "byte swapping 16- and 32-bit data", by Sam Leffler

1993.08.18 01:57 "Re: byte swapping 16- and 32-bit data", by Craig Hockenberry

Dan, et al.

I don't really have the time to make all the compression routines do the byte swapping efficiently. If I put byte swapping in the library, then it will go after the decoding is done, which will mean an extra pass over the data.

I really believe that the main reason most people use this library is its correctness, not its speed.

This is a rare enough case that I don't think the optimization is that critical.

I, for one, am very concerned with the speed of the library. I'm dealing with large (over 50 MB), multi-band imagery. As Sam points out, any operations that can be combined while getting this data to the display make a big difference.

To date, this has only been done on big-endian machines. We are doing a port to the Alpha at the moment...

Many consumers of the data will then pay the expense of the extra pass when they could have rolled it into some other operation.

If there are a significant number of people who would do the extra work to roll the byte swap into another operation, then add a settable option to turn it off.

It doesn't seem right having a special case. It will only work when your data is uncompressed and multi-sample. I'd think that the naive user would really be at a loss to understand why his data comes in one format when it's uncompressed and another when it's compressed.

If someone wants to make this "settable option" completely orthogonal, then I'd be much more inclined to say that this is a good idea. If you ask for byte swapped data, then you should _always_ get it.

An observation: I have found that many applications that read TIFF (without using libtiff) fail with compressed data. These applications include Corel Draw, Word Perfect and the Island products. I've even seen applications that have menus for reading PC TIFF (little-endian) and Mac TIFF (big-endian)! They obviously haven't implemented their code to fully conform with the spec.
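For what it's worth, the byte order is declared in the first two bytes of the header, so no menu should ever be needed. Roughly (a sketch of the check only, not libtiff's actual reader):

#include <stdio.h>

/*
 * Sketch only (not libtiff's reader): the first two bytes of a TIFF file
 * say which byte order the writer used, so a conforming reader never has
 * to ask the user.
 */
static int
tiff_byte_order(FILE *fp)
{
    unsigned char hdr[2];

    if (fread(hdr, 1, 2, fp) != 2)
        return -1;                      /* short read */
    if (hdr[0] == 'I' && hdr[1] == 'I')
        return 0;                       /* "II": little-endian ("PC TIFF") */
    if (hdr[0] == 'M' && hdr[1] == 'M')
        return 1;                       /* "MM": big-endian ("Mac TIFF") */
    return -1;                          /* not a TIFF at all */
}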

In these cases, the best mechanism for transfer is the uncompressed format with a single sample. Having byte-swapped, multi-sample data will presumably make it easier for these non-conformant applications to read the file. Compression is not really an issue, nor is speed. The current implementation of the library handles this situation correctly.

I understand the argument about folks not getting this right because they don't test cross-platform portability. I'm very concerned about that--it's one of the main reasons that I wrote the library in the first place.

I think lack of surprises is very important.

I think a simple, straightforward, unoptimized byte swap after decompression, with an option to disable it, would cover everybody's needs and keep new-user surprises to a minimum.

Yes, this seems to be the best way to go. It does assume, however, that there is someone who is willing to add the byte swapping after decompression. Sam has already stated that he doesn't have the time to do this work (so that it doesn't take multiple passes over the data).
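For the record, the unoptimized swap itself is tiny; the real work is wiring it into each decoder's output path. Something along these lines (the names are illustrative, not libtiff's):

#include <stddef.h>

/*
 * Illustrative sketch -- none of these names are libtiff's.  A naive swap
 * pass over decoded 16-bit samples, gated by a flag the application can
 * clear if it would rather keep the raw file order.
 */
static void
swab_shorts(unsigned short *buf, size_t n)
{
    while (n-- > 0) {
        unsigned short v = *buf;
        *buf++ = (unsigned short)((v << 8) | (v >> 8));
    }
}

static void
post_decode_fixup(unsigned short *strip, size_t nsamples,
    int file_order_differs, int app_wants_host_order)
{
    if (file_order_differs && app_wants_host_order)
        swab_shorts(strip, nsamples);   /* the extra pass over the data */
}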

Until this happens, it might be best to go with Sam's original plan, even if it means a loss of portability to other, non-conformant applications. The byte swapping option can be added at a later date, and could be transparent to existing applications using libtiff.

In either case, it is important to let the application decide how it wants the data. In my particular application, I'd choose to read unswapped bytes and amortize the cost of reordering as I create the display image.
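To make that concrete, the reorder can ride along with the conversion loop that has to touch every sample anyway. Illustrative only; scale_to_8bit() stands in for whatever stretch or lookup the application really applies:

#include <stddef.h>

/*
 * Illustrative sketch of amortizing the swap: the strip is touched once,
 * and the reorder costs a couple of shifts per sample instead of a
 * separate pass over the data.
 */
static unsigned char
scale_to_8bit(unsigned short v)
{
    return (unsigned char)(v >> 8);     /* trivial stand-in for the real stretch */
}

static void
strip_to_display(const unsigned short *src, unsigned char *dst,
    size_t n, int need_swap)
{
    size_t i;

    for (i = 0; i < n; i++) {
        unsigned short v = src[i];
        if (need_swap)
            v = (unsigned short)((v << 8) | (v >> 8));
        dst[i] = scale_to_8bit(v);      /* swap and convert in one touch */
    }
}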

Later,
-- 
Craig Hockenberry  |  Genasys II Pty. Ltd.
                   |  13th Level, 33 Berry St., North Sydney, NSW, Australia
           Bacco,  |
         Tobacco,  |  Phone:              +61-2-954-0022 (-9930 FAX)
        e Venere.  |  Internet:           craigh@g2syd.genasys.com.au