AWARE SYSTEMS
TIFF and LibTiff Mail List Archive

Thread

2009.04.26 17:22 "[Tiff] Packbits worst case encoded length", by Simon Berger
2009.04.29 14:28 "Re: [Tiff] Packbits worst case encoded length", by Toby Thain
2009.04.30 07:41 "Re: [Tiff] Packbits worst case encoded length", by Albert Cahalan
2009.04.30 13:58 "Re: [Tiff] Packbits worst case encoded length", by Toby Thain
2009.04.30 19:12 "Re: [Tiff] Packbits worst case encoded length", by Albert Cahalan
2009.04.30 19:25 "Re: [Tiff] Packbits worst case encoded length", by Toby Thain
2009.04.30 19:31 "Re: [Tiff] Packbits worst case encoded length", by Albert Cahalan
2009.04.30 22:30 "Re: [Tiff] Packbits worst case encoded length", by Toby Thain
2009.05.01 06:34 "Re: [Tiff] Packbits worst case encoded length", by Albert Cahalan
2009.05.01 14:21 "Re: [Tiff] Packbits worst case encoded length", by Toby Thain
2009.04.30 19:25 "Re: [Tiff] Packbits worst case encoded length", by Simon Berger
2009.04.29 18:07 "Re: [Tiff] Packbits worst case encoded length", by Simon Berger
2009.04.29 19:17 "Re: [Tiff] Packbits worst case encoded length", by Toby Thain
2009.04.29 19:43 "Re: [Tiff] Packbits worst case encoded length", by Simon Berger
2009.05.05 19:13 "Re: [Tiff] Packbits worst case encoded length", by Albert Cahalan
2009.05.05 16:39 "Re: [Tiff] Packbits worst case encoded length", by Bob Friesenhahn
2009.05.05 18:09 "Re: [Tiff] Packbits worst case encoded length", by Albert Cahalan
2009.05.05 18:32 "Re: [Tiff] Packbits worst case encoded length", by Bob Friesenhahn
2009.05.05 18:42 "[Tiff] Guard pages - was Re: Packbits worst case encoded length", by Toby Thain
2009.05.05 22:57 "Re: [Tiff] Packbits worst case encoded length", by Graeme Gill

2009.05.05 22:57 "Re: [Tiff] Packbits worst case encoded length", by Graeme Gill

> Yes and no. You're wasting address space, not the RAM itself. There won't be a big chunk of unused RAM because of this.
>
> At worst, some systems will reserve some swap space.

In many systems this amounts to the same thing. Lots of 32-bit systems have 3-4 GB of memory now. It's really interesting to see the sorts of problems that crop up due to virtual address space exhaustion when you actually try to use all that memory! (i.e. address-space fragmentation problems in the memory allocators).

> I'm suggesting to use a different allocator for risky allocations. These would primarily be buffers that you decompress into.

Since this requires judgment on the programmer's part, it would seem to map right back to the need to write perfect code.

I actually think it is possible to make a 100% bulletproof file parser, assuming the format isn't stupidly intricate or performance sensitive. The approach is to funnel all the file accesses, and the resulting memory allocations, through one small set of routines that can reasonably be made bug-free.

That won't prevent higher-level attacks (i.e. triggering memory problems by exploiting bugs in the code that interprets the contents of the parsed file), but it would prevent any exploits in the actual parsing of the file itself.

Graeme Gill.