2004.01.14 12:01 "[Tiff] COLORMAP and byte padding", by Stephan Assmus

Hello,

I'm adding write support to the libtiff-based OpenBeOS TIFF Translator, and I want to support palette images. The format of the colormap that

TIFFSetField(tif, TIFFTAG_COLORMAP, ???);

expects is unclear to me. In the documentation, it just says

Tag Name           Count  Type      Notes

TIFFTAG_COLORMAP   3      uint16*   1<<BitsPerSample arrays

Ok, so there are 256 entries in the palette if I have 8 bits per sample. That much I follow, but what's with the uint16*? And why is Count == 3? The BeOS colormap bitmaps all use the same system-wide palette, which consists of 256 entries, each representing an RGB color with 3 (actually 4) 8-bit values for r, g, b (and alpha). Am I supposed to squeeze a 24-bit RGB value into 16 bits and pass 256 uint16s? Or do I pass 3 * 256 uint16s, with 16 bits for each of r, g and b?
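
For what it's worth, here is my current guess at how the call should look, assuming three parallel arrays of 1 << BitsPerSample uint16 values and scaling each 8-bit channel up to 16 bits by multiplying with 257. The function name, the use of the BeOS rgb_color type and the scaling are just my assumptions, not something I found in the docs:

  #include <tiffio.h>
  #include <GraphicsDefs.h>   /* BeOS rgb_color */

  static void
  set_colormap_from_beos_palette(TIFF* tif, const rgb_color* palette)
  {
      uint16 red[256], green[256], blue[256];
      int i;

      for (i = 0; i < 256; i++) {
          /* 0..255 -> 0..65535; I assume this is the expected scaling */
          red[i]   = (uint16)palette[i].red   * 257;
          green[i] = (uint16)palette[i].green * 257;
          blue[i]  = (uint16)palette[i].blue  * 257;
      }

      /* passing the three arrays as separate arguments is my guess
         at what "Count == 3" means */
      TIFFSetField(tif, TIFFTAG_COLORMAP, red, green, blue);
  }
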

I also have a couple more questions about strip size and byte padding. There seem to be two alternative fields that determine strip size:

TIFFTAG_ROWSPERSTRIP       rows per strip of data
TIFFTAG_STRIPBYTECOUNTS    bytes counts for strips

Can both be used at the same time to tell libtiff that

strip byte counts != rows per strip * samples per pixel * width?

(So that there are some padding bytes at the end of each row...)
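
To make that concrete: if the answer is no, and strips have to be tightly packed, I assume I would have to repack every padded BeOS row myself before handing it to libtiff, something like the sketch below. Here bits and bytesPerRow stand for the BBitmap buffer and its padded row length, and the function name is just mine:

  #include <string.h>
  #include <tiffio.h>

  /* Copy each padded source row into a tight scanline buffer before
     writing; bytesPerRow includes the padding, TIFFScanlineSize() does not. */
  static int
  write_packed_rows(TIFF* tif, const unsigned char* bits,
                    uint32 height, uint32 bytesPerRow)
  {
      tsize_t scanlineSize = TIFFScanlineSize(tif);
      unsigned char* buffer = (unsigned char*)_TIFFmalloc(scanlineSize);
      uint32 row;

      if (buffer == NULL)
          return -1;

      for (row = 0; row < height; row++) {
          memcpy(buffer, bits + row * bytesPerRow, scanlineSize);
          if (TIFFWriteScanline(tif, buffer, row, 0) < 0)
              break;
      }

      _TIFFfree(buffer);
      return row == height ? 0 : -1;
  }

If libtiff can be told about the real (padded) byte counts instead, I could skip the extra copy.
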

Best regards,
-Stephan