2022.06.05 11:11 "Re: [Tiff] Upgrade of CI image", by Roger

Hi Greg,

These are all good points. As you said, it all comes down to supporting what is documented. Unfortunately, what we actually support isn't documented all that clearly, and having a reasonable set of requirements agreed on and written down would make the scope of the CI testing much clearer.

If you look at the CI testing (example: https://gitlab.com/libtiff/libtiff/-/pipelines/555776687), you'll see we have reasonable coverage of various platforms. We test Cygwin, MinGW and VS2019 on Windows through AppVeyor, and GCC on Ubuntu Linux using GitLab directly (and, until last week, LLVM/clang on FreeBSD arm64, which I had to take offline because its storage is knackered). macOS has no CI coverage at present (we can add it now that GitLab supports it, but we need to register for it). We could support a wider range of MSVC/Visual Studio versions: we don't yet test on VS2022, and we could test on VS2017 or earlier. But every addition to the build matrix carries a maintenance burden as well as slowing things down. The same applies to testing multiple Linux distributions and distribution versions. Where do we draw the line? What would be "good enough"? On top of that, we test both CMake and Autotools, and if we were truly comprehensive the combinatorial explosion would be huge. So I'm not averse to extending the coverage at all, but I do want to keep it manageable.
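To make the combinatorics concrete, here is a rough sketch of how a GitLab matrix over just three distribution images and two build systems already expands to six jobs. This is only an illustration, not our actual .gitlab-ci.yml; the job name, image tags, variables and helper script are all made up:

    linux-matrix:
      image: $DISTRO
      parallel:
        matrix:
          # Three images x two build systems = six jobs; add compiler
          # versions or more distributions and it multiplies again.
          - DISTRO: ["ubuntu:22.04", "ubuntu:20.04", "debian:11"]
            BUILD: ["cmake", "autotools"]
      script:
        # Hypothetical wrapper script, one per build system.
        - ./ci/build-with-$BUILD.sh

Every row or column we add to a matrix like this multiplies the number of jobs, which is where the maintenance burden and the slowdown come from.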

One point to consider is who our primary consumers/customers are. I'd posit that our biggest users, by a large margin, are using pre-packaged libtiff builds provided by package managers, and that ensuring they get a well-tested and usable library release has the best cost:benefit for our end users and developers. This includes all of the Linux package managers (apt/yum/dnf/gentoo/arch), macOS (Homebrew and MacPorts), the FreeBSD and other BSD ports trees, and more recently vcpkg on Windows.

The common thread with all of these is that they are either rolling releases building against a recent toolchain, or they are built against a recent toolchain and then frozen for the lifetime of a distribution's major release. What we don't have (for the most part) is building of a new libtiff release on an older platform; that tends to cause ABI breaks on LTS releases when the new build conflicts with the system copy. Look at pylibtiff for an example of that. So testing on current platforms is most beneficial for ensuring that all of these consumers are getting a working release.

The other use case is direct embedding of libtiff as a third-party library within a larger project, or hand-building libtiff and using it within other projects. This is common in proprietary projects, and it is more difficult to support, since we are unpaid volunteers and have no insight into what people are actually using.

I'm totally on board with adding e.g. Ubuntu 20.04 back, and maybe 18.04 as well. However, we do need to be somewhat restrained in how many jobs we have in total, and in which build system we test for each platform. I would *love* to have some data on which GCC versions people are using, so that we have some evidence to justify what we commit to support, rather than blindly supporting old systems "just because". If our primary userbase isn't even using new versions of libtiff on older systems, the cost:benefit would be low.
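For what it's worth, adding Ubuntu 20.04 back could be as small as one extra job along these lines. Again, this is only a sketch under assumptions: the job name, package list and build steps are mine, not what is in the current .gitlab-ci.yml:

    cmake-ubuntu-20.04:
      image: ubuntu:20.04
      variables:
        DEBIAN_FRONTEND: noninteractive   # avoid tzdata's interactive prompt
      script:
        # Assumed minimal toolchain and codec dependencies.
        - apt-get update -qq
        - apt-get install -y --no-install-recommends build-essential cmake zlib1g-dev libjpeg-dev
        # Out-of-tree CMake build plus the test suite.
        - cmake -S . -B build -DCMAKE_BUILD_TYPE=Release
        - cmake --build build
        - cd build && ctest --output-on-failure

A matching Autotools job would look much the same, which is exactly how the matrix balloons if we aren't selective about which build system we exercise on each platform.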

Once the new Sphinx docs (rst-docs branch) are merged, I would be very happy for us to add a "supported platforms" page to the docs that sets this out properly. It would be great to gather comments. I created https://gitlab.com/libtiff/libtiff/-/issues/430 for this, so if anyone wants to contribute any data, please feel free.

Kind regards,

Roger