2006.12.06 20:56 "[Tiff] Re: Grayscale, or is it?", by Joris
I'm forever having trouble with grayscale in TIFF, but now I really need to resolve it and make a final decision.
And so I have. I choose option 1).
In summary, I think we all agree that RGB in a TIFF file, when not accompanied by an ICC profile or any other mechanism for additional colourspace definition, is best interpreted as sRGB. This means it has a gamma of 2.2, it corresponds closely to the 'average monitor' out there, and thus it equals what most people mean when they refer to 'device RGB'. So for most of us, no further conversion is required when reading or writing it.
To be absolutely clear: if you don't know what exactly 'sRGB' is, or what exact gamma the RGB you've been playing with all this time has, then it is safe to assume that all RGB you've seen is sRGB, has this gamma of 2.2 (which you're perfectly free to ignore completely), and is exactly the same as what you find in a TIFF file.
While the spec has the little remark I quoted that seems to say otherwise, I think most of us agree that grayscale in TIFF is de facto the R=G=B subset of the above-mentioned sRGB. Thus this grayscale, too, has a gamma of 2.2. You can convert from TIFF grayscale to normal, ordinary sRGB easily, by setting all three channels R, G, and B to the grayscale value.
To be absolutely clear: if you don't know what exactly the grayscale you've been playing with all this time is, then it is safe to assume that all grayscale you've seen is this subset of sRGB, has this gamma of 2.2 (which you're perfectly free to ignore completely), and is exactly the same as what you find in a TIFF file.
This conclusion is consistent with the LibTiff RGBA interface implementation.
Joris Van Damme