2016.11.21 19:45 "Re: [Tiff] 12 bit byte format", by Aaron Boxer

TIFFTAG_BITSPERSAMPLE equals 12

TIFFTAG_SAMPLESPERPIXEL equals 3

I guess there is no standard format for 12-bit data, in terms of the bit layout on disk.
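For reference, since libtiff hands the strip back in the packed on-disk layout when BitsPerSample is 12, here is a minimal sketch of unpacking such a strip into 16-bit words. It assumes the default MSB-first FillOrder; the file name and the unpack12() helper are mine, not part of libtiff:

    #include <stdint.h>
    #include <stdio.h>
    #include <tiffio.h>

    /* Unpack MSB-first packed 12-bit samples: every 3 bytes carry 2 samples. */
    static void unpack12(const uint8_t *in, uint16_t *out, size_t nsamples)
    {
        size_t i;
        for (i = 0; i + 2 <= nsamples; i += 2) {
            out[i]     = (uint16_t)((in[0] << 4) | (in[1] >> 4));
            out[i + 1] = (uint16_t)(((in[1] & 0x0F) << 8) | in[2]);
            in += 3;
        }
        if (i < nsamples)   /* odd trailing sample in the last 2 bytes */
            out[i] = (uint16_t)((in[0] << 4) | (in[1] >> 4));
    }

    int main(void)
    {
        TIFF *tif = TIFFOpen("input.tif", "r");   /* hypothetical input */
        uint16_t bps = 0, spp = 0;
        if (tif == NULL)
            return 1;
        TIFFGetField(tif, TIFFTAG_BITSPERSAMPLE, &bps);
        TIFFGetField(tif, TIFFTAG_SAMPLESPERPIXEL, &spp);
        printf("BitsPerSample=%u SamplesPerPixel=%u\n",
               (unsigned)bps, (unsigned)spp);

        if (bps == 12) {
            tmsize_t  stripsize = TIFFStripSize(tif); /* packed bytes/strip */
            uint8_t  *packed  = (uint8_t *)_TIFFmalloc(stripsize);
            uint16_t *samples = (uint16_t *)_TIFFmalloc(
                (tmsize_t)(stripsize * 8 / 12) * sizeof(uint16_t));
            for (uint32_t s = 0; s < TIFFNumberOfStrips(tif); s++) {
                tmsize_t n = TIFFReadEncodedStrip(tif, s, packed, stripsize);
                if (n < 0)
                    break;
                /* samples[] then holds right-justified 12-bit values */
                unpack12(packed, samples, (size_t)n * 8 / 12);
            }
            _TIFFfree(packed);
            _TIFFfree(samples);
        }
        TIFFClose(tif);
        return 0;
    }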

On Mon, Nov 21, 2016 at 2:43 PM, Bob Friesenhahn <bfriesen@simple.dallas.tx.us> wrote:

On Mon, 21 Nov 2016, Aaron Boxer wrote:

> On Mon, Nov 21, 2016 at 2:21 PM, Kemp Watson <kemp@objectivepathology.com>
> wrote:
>
>> Do you have control of the original writing end of the data? If so, why
>> not store as 16-bit? Vastly easier…

> Thanks. The data is stored in 16 bits, with a tag indicating that 12 bits
> are used. But when I read the data with libtiff, I get the format I
> mentioned earlier: the strip is returned packed as 12-bit samples.

If the data is stored in 16 bits but the depth is indicated as 12 bits, then that would be a pretty serious encoding error.
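To make the writer-side distinction concrete, here is a sketch under the assumption that the writer controls the tags: BitsPerSample describes the on-disk layout, not the significant depth, so 12-significant-bit data kept in 16-bit words must be declared as 16. MaxSampleValue is a standard TIFF 6.0 tag that can record the real dynamic range:

    /* Correct declaration when 12-significant-bit data sits in 16-bit
     * words: declaring 12 instead would promise tightly packed samples,
     * which is exactly what libtiff would then hand back on read. */
    TIFFSetField(tif, TIFFTAG_BITSPERSAMPLE, 16);
    TIFFSetField(tif, TIFFTAG_SAMPLESPERPIXEL, 3);
    /* Optionally record the real range (0..4095) so readers can rescale. */
    TIFFSetField(tif, TIFFTAG_MAXSAMPLEVALUE, 4095);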

What "tag" is indicating that 12-bits are used?

Does GraphicsMagick properly read this file (12 bits in 16-bit samples would be dim, but not scrambled)?
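As for "dim, but not scrambled": 4095/65535 is about 6% of full scale, so an unscaled view of 12-in-16-bit data looks nearly black but structurally intact. A common expansion (my example, not anything from the thread) replicates the top bits so 0 maps to 0 and 4095 to 65535 exactly:

    /* Expand v in 0..4095 to the full 16-bit range: (v << 4) | (v >> 8)
     * maps 0 -> 0 and 4095 -> 65535 with no floating point. */
    static uint16_t scale12to16(uint16_t v)
    {
        return (uint16_t)((v << 4) | (v >> 8));
    }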