libpqxx:generate-config

Last commit made on 2023-11-03
Get this branch:
git clone -b generate-config https://git.launchpad.net/libpqxx

Branch information

Name:
generate-config
Repository:
lp:libpqxx

Recent commits

ed1a239... by Jeroen T. Vermeulen

Add `$CPPFLAGS` to preprocessor command line.

7c6407c... by Jeroen T. Vermeulen

Delete old C++ features header.

9a7a996... by Jeroen T. Vermeulen

Switch to new header.

599fd9d... by Jeroen T. Vermeulen

Install generated C++ features header.

36ccbcf... by Jeroen T. Vermeulen

Switch to the preprocessor approach.

133494e... by Jeroen T. Vermeulen

Trying to fix small things.

9f0d438... by Jeroen T. Vermeulen

Support older python.

4aea35c... by Jeroen T. Vermeulen

Unquote another argument.

31946a1... by Jeroen T. Vermeulen

Quote an argument.

fbb44b2... by Jeroen T. Vermeulen

Feature-test macros + config header!

We're getting bug reports (e.g. #732) for situations where people have a
libpqxx compiled as C++17 but an application compiled as C++20, or _vice
versa._

Generally it's probably not wise to link C++17-compiled code to code
compiled as C++20. But two things I did exacerbated the problem:
1. I moved a bunch of C++ feature checks to compile time, using C++20's
   feature test macros. It looked like a much cleaner, easier way
   forward with reduced maintenance overhead.
2. Usage of C++20's `std::source_location` _when present_ affects the
   ABI. So effectively there's one ABI with, and one without. I see
   that mostly as the price of doing libraries in C++ — it's generally
   dangerous to package library binaries, unless they've been designed
   to be compatible, or they come as a range of binaries for various
   compilers, versions, and configurations.

And the real problem is that _these two changes interacted._ The
detection of support for `std::source_location` happened at compile
time. And so if you compile libpqxx in C++17 (without this feature)
and the application in C++20 (with this feature), the two will not be
link-compatible!
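
To make that concrete, here is a minimal sketch (a made-up function,
not actual libpqxx API) of how such a compile-time check splits the
ABI. Built as C++20, the declaration gains an extra
`std::source_location` parameter, so the two builds disagree on the
function's mangled name:

```cpp
// Hypothetical declaration, for illustration only.
#if __has_include(<source_location>)
#  include <source_location>
#endif

#if defined(__cpp_lib_source_location)
// C++20 build: extra parameter, hence a different mangled name.
void report_error(char const *msg,
                  std::source_location loc =
                      std::source_location::current());
#else
// C++17 build: no such parameter.
void report_error(char const *msg);
#endif
```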

In this first commit I'm prototyping a new approach that I hope will
combine the ease of maintenance of feature test macros with the ABI
stability of a configuration header. Configuration speed should lie
somewhere in between: no more compiling separate little checks for
every individual C++ feature.
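
The target is roughly a header like the sketch below (the macro names
are invented for illustration, not the actual libpqxx configuration
macros). The answers get baked in when libpqxx itself is configured,
so the application's own `-std=` setting can no longer change them:

```cpp
// Sketch of a generated configuration header.  Written once by the
// build configuration step, then included by both the library and the
// application, so both sides see the same answers.
#define PQXX_HAVE_SOURCE_LOCATION 1
#define PQXX_HAVE_CONCEPTS 1
#define PQXX_HAVE_CHARCONV_FLOAT 0
```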

But it's not easy. There are 2 orthogonal binary choices, leaving me
with 4 scenarios to support:
* Autoconf vs. CMake
* Cross-compiled vs. native.

How does cross compilation factor into it? It works like this: I need
to check C++ feature test macros. I check them in C++. But I don't
want to keep editing that C++ code for every feature — there's a
repetitive preprocessor dance that I don't think I could simplify to one
simple line of code, because I'd need to pass macro names as parameters
to macros. So, I write a minimal config file and run it through a
Python script that generates the C++ code for me. Then I have the build
configuration machinery compile _and run_ that C++ code, and generate
the configuration header.
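
The generated C++ could look something like this hand-written sketch
(it is not the script's actual output, and the macro names are again
illustrative). The build machinery compiles it with the same compiler
and flags as the library, runs it, and redirects its output into the
config header:

```cpp
#include <iostream>

#if __has_include(<version>)
#  include <version>   // C++20 library feature-test macros, if available
#endif

int main()
{
    // One block like this per feature in the minimal config file;
    // this repetition is the "preprocessor dance" the script hides.
#if defined(__cpp_lib_source_location) && __cpp_lib_source_location >= 201907L
    std::cout << "#define PQXX_HAVE_SOURCE_LOCATION 1\n";
#else
    std::cout << "#define PQXX_HAVE_SOURCE_LOCATION 0\n";
#endif

#if defined(__cpp_concepts) && __cpp_concepts >= 201907L
    std::cout << "#define PQXX_HAVE_CONCEPTS 1\n";
#else
    std::cout << "#define PQXX_HAVE_CONCEPTS 0\n";
#endif
}
```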

Yes, alright, but how does cross compilation factor into it? First, if
you're cross-compiling, it's not a given that you can _run_ the binary
you produce on the same system! The whole point is that the two systems
are different. And second, you'll have a native compilation environment
but there's no guarantee that it will resemble the cross compilation
environment at all. So if you compile a binary to run locally, you may
get very different results.

So for cross-compilation, the Python script simply generates a minimal
configuration header with all features disabled. And in this
first commit I've got that working for autoconf. But I'm still
struggling with CMake (thanks @KayEss for helping with this).
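
That fallback header can be as small as this (same illustrative macro
names as above):

```cpp
// Cross-compilation fallback: no probe binary gets run on the build
// machine, so every optional feature is reported as absent.
#define PQXX_HAVE_SOURCE_LOCATION 0
#define PQXX_HAVE_CONCEPTS 0
#define PQXX_HAVE_CHARCONV_FLOAT 0
```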

If it gets too difficult, I may go a different route: generate C++ code
from Python, but only run it through the preprocessor. (I believe
autoconf has a standard, portable way of running the preprocessor; let's
hope CMake has one as well.) The output will still be C++ code, but now
it's been preprocessed, so hopefully it'll be possible to tell portably
which features are present. And ironically, I think I'd then have to
have another Python script to _postprocess_ the preprocessed code and
turn it into a ready-to-use config header.
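
As a rough sketch of that route (the marker names are invented for
illustration): the preprocessor-only input could leave behind plain
identifier-and-number marker lines, which pass through any conforming
preprocessor untouched, and a small postprocessing script would pick
them out of the preprocessed output and rewrite them as real
`#define`s:

```cpp
// Meant only for the preprocessor (e.g. "cc -E"), never for
// compilation.  Each feature check survives preprocessing as a
// recognisable marker line; a postprocessing script turns the
// markers into the final config header.
#if __has_include(<version>)
#  include <version>
#endif

#if defined(__cpp_lib_source_location)
PQXX_MARKER_source_location 1
#else
PQXX_MARKER_source_location 0
#endif

#if defined(__cpp_concepts)
PQXX_MARKER_concepts 1
#else
PQXX_MARKER_concepts 0
#endif
```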