AWARE SYSTEMS
TIFF and LibTiff Mail List Archive

Thread

1994.09.16 04:17 "TIFF Bit Ordering Versus Fill Order", by John M Davison
1994.09.16 17:42 "Re: TIFF Bit Ordering Versus Fill Order", by Sam Leffler
1994.09.19 06:55 "Re: TIFF Bit Ordering Versus Fill Order", by Karsten Spang
1994.09.19 15:26 "Re: TIFF Bit Ordering Versus Fill Order", by Sam Leffler
1994.09.19 11:48 "Re: TIFF Bit Ordering Versus Fill Order", by Niles Ritter
1994.09.19 22:48 "Re: TIFF Bit Ordering Versus Fill Order", by Sam Leffler
1994.09.20 00:03 "Re: TIFF Bit Ordering Versus Fill Order", by Joe Moss
1994.09.20 07:00 "Re: TIFF Bit Ordering Versus Fill Order", by Fredrik Lundh
1994.09.19 17:34 "Now that you mention bit order...", by Craig Jackson
1994.09.19 19:04 "Bit order revisited ...", by Scott Wagner
1994.09.19 22:46 "Re: Now that you mention bit order...", by Sam Leffler
1994.09.19 23:53 "RE: Now that you mention bit order...", by Craig Jackson
1994.09.20 13:25 "RE: Now that you mention bit order...", by Scott Wagner
1994.09.22 22:50 "Re: Now that you mention bit order...", by John M Davison
1994.09.22 23:49 "Re: Now that you mention bit order...", by Sam Leffler
1994.09.19 23:31 "Re: TIFF Bit Ordering Versus Fill Order", by Jim Arnold
1994.09.20 12:29 "MSB vs. LSB. (Was: TIFF Bit Ordering Versus Fill Order)", by Thomas Kinsman

1994.09.19 11:48 "Re: TIFF Bit Ordering Versus Fill Order", by Niles Ritter

Bit order and byte order are not necessarily related. I have experience working with devices that had/used byte order different from bit order.

What sort of devices?

I noticed that the libtiff code makes a comment to the effect, "How can bit-order be determined at runtime?" Are we talking here about the host CPU or an independent display device of some sort? If the latter, then it is hopeless for libtiff to try to determine "bit-order" at runtime.

Byte order is naturally defined by the addressing mechanism, but bits are not addressed directly. However, a possible way to define host bit order would be to use C bit-field struct operations. As an experiment, I compiled and ran the following code on SGI, HP, Sun, Macintosh, VAX/VMS, Ultrix, and Alpha systems (sorry, I couldn't find anyone who uses PCs around here -- could someone run this on a PC and report back?):

#include <stdio.h>

struct HighLowBits {
     unsigned int highbit:  1;
     unsigned int lowbits:  7;
};

int main(void)
{
   struct HighLowBits testbits;
   int one = 1;
   int bigendian = (*(char *)&one == 0); /* byte-order test from libtiff */

   printf("This is a %s machine\n", bigendian ? "BigEndian" : "LittleEndian");

   /* determine bit-order: set only the high-order field, then read the
      whole byte back to see where the bit landed */
   testbits.highbit = 1;
   testbits.lowbits = 0;
   printf("Byte with highbit set has value %d\n", *(unsigned char *)&testbits);

   return 0;
}

On the Sun, HP, Macintosh and SGI the code returned:

   This is a BigEndian machine
   Byte with highbit set has value 128

While under VAX/VMS, Ultrix and Alpha the code returned:

   This is a LittleEndian machine
   Byte with highbit set has value 1

So, there does appear to be a strong correlation between bit and byte order. Are there any platforms on which this code returns a combination other than these two? Or is this approach compiler-implementation-dependent rather than a property of the hardware? In any case, if bit-order is to have any practical meaning to programmers, this would appear to be a way to determine it objectively at runtime; the simplest form of the test would be something like:

  struct HighLowBits {
     unsigned int highbit:  1;
     unsigned int lowbits:  7;
  };
  unsigned char char_one=1;
  int msb_to_lsb;
  msb_to_lsb = (*(struct HighLowBits *)&char_one).highbit==0;

I would be interested if anyone can come up with a non-equivalent alternative (programmer's-level) definition.

--Niles