2009.06.15 16:57 "Re: [Tiff] Tiff 16 bit and 32 bit images", by Chris Cox
For most things, bit depth and color mode (interpretation) are independent.
You can have 32 bit float RGB, Grayscale, LAB, etc. images.
You never count the total number of bits. Only bits per channel. This is important because the channel count is not limited by the color mode, and because counting the total number of bits leads to confusion (how many different ways could you pack values into 64 bits?).
16 bit RGB means 16 bits of Red, 16 bits of Green, 16 bits of Blue.
From: firstname.lastname@example.org [email@example.com] On Behalf Of Keshab Neupane [firstname.lastname@example.org]
I am a complete beginner with TIFF, so please excuse me if I say something silly.
I am writing an application that saves a TIFF image into a buffer and then reconstructs the image from that buffer.
For this, I need to handle all types of images. I am having trouble with TIFF 16-bit signed/unsigned integer and 32-bit float images. Does 16/32 bit mean they are color images, or can they be grayscale too? I am using TIFFReadScanline to read the data, but the problem is that my buffer (this is a constraint) is a char*, so how can I access the data if it is 16/32 bit? Also, how does this relate to bpp, bps, and spp? I mean, does 16 bit mean 16 bits for each of R, G, and B, or 16 bits in total for R, G, and B? And the same question for 32 bit.
Thank you very much for your help.