2017.06.28 02:26 "Re: [Tiff] Excessive memory allocation while chopping strips", by Olivier Paquet
2017-06-27 18:52 GMT-04:00 Even Rouault <email@example.com>:
Fuzzers (specifically the address sanitizer) crash in these situations, treating attempts at huge memory allocations as bugs. libtiff itself will not crash if a malloc() fails (or if it does, that is a real bug), but attempting to allocate a lot of memory when it is not needed is bad practice. And indeed the files processed by fuzzers are at most a few kilobytes in size.
I think we should be careful, then, not to break libtiff because of a broken tool. It's great if we can reduce the huge allocations, but it seems to me that, because of the nature of the TIFF format, they will be hard to eliminate completely without breaking some valid files or writing a lot of complex code (read: code with new, unknown bugs). Just something to keep in mind.
With that said, I don't mind limiting the patch to split huge strips. I don't think we're helping anyone by applying it to files millions of pixels high. However, won't it just push the potential allocation problem to strip size? Or are there other checks there?
An API to set an allocation limit is a little tricky, as TIFFOpen() will read the first IFD unless you use the 'h' flag, and that IFD could contain large tags. But otherwise, it looks like something which might be useful to someone required to deal with unsafe input.