CMake doesn't have convincing facilities for doing this in its install
step, so the new advice is to do it manually (we've provided no
equivalent to the autotools --enable-setuid or --enable-setgid options,
nor UTMP_USER/GROUP).
This brings various concrete advantages over the previous system:
- consistent support for out-of-tree builds on all platforms
- more thorough support for Visual Studio IDE project files
- support for Ninja-based builds, which is particularly useful on
Windows where the alternative nmake has no parallel option
- a really simple set of build instructions that work the same way on
all the major platforms (look how much shorter README is!)
- better decoupling of the project configuration from the toolchain
configuration, so that my Windows cross-building doesn't need
(much) special treatment in CMakeLists.txt
- configure-time tests on Windows as well as Linux, so that a lot of
ad-hoc #ifdefs second-guessing a particular feature's presence from
the compiler version can now be replaced by tests of the feature
itself (see the sketch just after this list)
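As an illustration of what such a configure-time test looks like (the
symbol and macro names here are just an example, not necessarily ones
the real CMakeLists.txt checks), a CMake feature probe might read:

    # Hypothetical sketch: test for the feature itself, rather than
    # guessing its presence from the compiler version.
    include(CheckSymbolExists)
    check_symbol_exists(strtoumax "inttypes.h" HAVE_STRTOUMAX)
    if(HAVE_STRTOUMAX)
      add_compile_definitions(HAVE_STRTOUMAX)
    endif()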
Also some longer-term software-engineering advantages:
- other people have actually heard of CMake, so they'll be able to
produce patches to the new build setup more easily
- unlike the old mkfiles.pl, CMake is not my personal problem to
maintain
- most importantly, mkfiles.pl was just a horrible pile of
unmaintainable cruft, which even I found painful to make changes
to or to use, and desperately needed throwing in the bin. I've
already thrown away all the variants of it I had in other projects
of mine, and was only delaying this one so we could make the 0.75
release branch first.
This change comes with a noticeable build-level restructuring. The
previous Recipe worked by compiling every object file exactly once,
and then making each executable by linking a precisely specified
subset of the same object files. But in CMake, that's not the natural
way to work - if you write the obvious command that puts the same
source file into two executable targets, CMake generates a makefile
that compiles it once per target. That can be an advantage, because it
gives you the freedom to compile it differently in each case (e.g.
with a #define telling it which program it's part of). But in a
project that has many executable targets and had carefully contrived
to _never_ need to build any module more than once, all it does is
bloat the build time pointlessly!
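For example (with made-up target and file names rather than the real
ones), the obvious way to write it would be something like this, which
compiles terminal.c and ssh.c once for each of the two programs:

    # Naive sketch: every source file named in a target is compiled
    # separately for that target, so terminal.c and ssh.c are each
    # built twice here.
    add_executable(putty putty_main.c terminal.c ssh.c)
    add_executable(plink plink_main.c terminal.c ssh.c)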
To avoid slowing down the build by a large factor, I've put most of
the modules of the code base into a collection of static libraries
organised vaguely thematically (SSH, other backends, crypto, network,
...). That means all those modules can still be compiled just once
each, because once each library is built it's reused unchanged for all
the executable targets.
One upside of this library-based structure is that now I don't have to
manually specify exactly which objects go into which programs any more
- it's enough to specify which libraries are needed, and the linker
will figure out the fine detail automatically. So there's less
maintenance to do in CMakeLists.txt when the source code changes.
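In CMake terms the arrangement looks roughly like this (the library
names, groupings and source files shown are illustrative, not the
exact contents of CMakeLists.txt):

    # Each module is compiled once, into a thematically grouped
    # static library...
    add_library(crypto STATIC aes.c sha256.c mpint.c)
    add_library(network STATIC socket.c proxy.c)
    add_library(sshclient STATIC ssh.c sshcommon.c)

    # ...and each program just names the libraries it needs; the
    # linker pulls out only the objects actually referenced.
    add_executable(plink plink_main.c)
    target_link_libraries(plink sshclient crypto network)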
But that reorganisation also adds fragility, because of the
traditional Unix linker semantics of scanning each library in the
list just once, so that cyclic references between the libraries
provoke link errors. The current setup builds successfully, but I
suspect it only just manages it.
(In particular, of the Windows compilers I've tried building with,
I've found that MinGW is the most finicky on this score. So I've
included a MinGW test build in the new-look Buildscr, because
otherwise I think there'd be a significant risk of introducing
MinGW-only build failures due to library search order, which wasn't a
risk in the previous library-free build organisation.)
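To make the ordering constraint concrete (again with illustrative
library names), the order given to target_link_libraries() broadly
becomes the order of the static libraries on the link command line,
and a one-pass linker needs each library to appear before the ones it
takes symbols from:

    # sshclient refers to symbols in crypto and network, so it has to
    # come first; if crypto also referred back into sshclient, no
    # single ordering would satisfy a one-pass linker.
    target_link_libraries(plink sshclient crypto network)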
In the longer term I hope to be able to reduce the risk of that, via
gradual reorganisation (in particular, breaking up too-monolithic
modules, to reduce the risk of knock-on references when you include a
module for function A and it also contains function B with an
unsatisfied dependency you didn't really need). Ideally I want to
reach a state in which the libraries all have sensibly described
purposes, a clearly documented (partial) order in which they're
permitted to depend on each other, and a specification of what stubs
you have to put where if you're leaving one of them out (e.g.
nocrypto) and what callbacks you have to define in your non-library
objects to satisfy dependencies from things low in the stack (e.g.
out_of_memory()).
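A very rough sketch of how that might eventually look in
CMakeLists.txt (the stub file, target and library names here are all
hypothetical):

    # Hypothetical: a tool that leaves out the crypto library links a
    # stub translation unit in its place, and its own objects define
    # the callbacks (such as out_of_memory()) that the low-level
    # libraries expect to call.
    add_executable(sometool sometool_main.c nocrypto_stub.c)
    target_link_libraries(sometool network utils)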
One thing that's gone completely missing in this migration,
unfortunately, is the unfinished MacOS port linked against Quartz GTK.
That's because it turned out that I can't currently build it myself,
on my own Mac: my previous installation of GTK had bit-rotted as a
side effect of an Xcode upgrade, and I haven't yet been able to
persuade jhbuild to make me a new one. So I can't even build the MacOS
port with the _old_ makefiles, and hence, I have no way of checking
that the new ones also work. I hope to bring that port back to life at
some point, but I don't want it to block the rest of this change.
After a conversation this week with a user who tried to use it, it's
clear that Borland C can't build the up-to-date PuTTY without making
too many compromises of functionality (unsupported API details, no
'long long' type), over and above the issues that could be worked
round with extra porting ifdefs.
(In a XXX-REVIEW-BEFORE-RELEASE form.)
Also, note the effect of compilation with different Visual Studio
versions on Windows version compatibility in the source README, for the
sake of having it written down somewhere.
I've reset the baseline to be the version of mingw-w64 that comes with
Ubuntu 14.04. Right now, that means no features need to be omitted; all
you need to do is set TOOLPATH to i686-w64-mingw32- .
I've removed -mno-cygwin without comment. Toolchains which don't support
this flag have been around since at least 2012, so we can probably
assume that no-one cares about older toolchains by now.
It's really only useful with MinGW rather than a Cygwin toolchain these
days, as recent versions of the latter insist on linking against the
Cygwin DLL.
(I think it may no longer be possible to build with Cygwin out of the
box at all these days, but I'm not going to say so without having
actually checked that's the case. Settle for listing MinGW first in
various comments and docs.)
the real configure script from the unix subdirectory, but with cwd
unchanged so that you end up doing a VPATH build in the top-level
source directory.
Should, ideally, placate the people who expect 'configure' to be at
the top level, while still letting _me_ keep all the Unix-specific
stuff in the Unix subdirectory.
[originally from svn r9241]
--without-gtk as a means of manually overriding the makefile into one
that builds the command-line tools only (as it would if GTK were not
found at all at configure time).
[originally from svn r9240]
mkfiles.pl no longer generates a Makefile.in, but instead generates a
Makefile.am on which mkauto.sh runs automake. This means that the
autoconfigured makefile now does build-time dependency tracking (a
standard feature of automake-generated makefiles), and is generally
more like what Unix people will expect.
Some of the old-style make command-line settings (VER=-DRELEASE=foo,
XFLAGS=-DDEBUG) will still work; the COMPAT settings are better done
by autoconfiguration. My habitual 'XFLAGS="-g -O0"' for an easily
debuggable build will no longer work, because CFLAGS is specified
_after_ XFLAGS; instead I should write 'make CFLAGS=-O0' (-g is the
default in automake, and is removed at 'make install' time).
The new makefile will automatically degrade into one that builds the
command-line tools only, in the case where GTK could not be found. In
principle, therefore, it should be an adequate replacement for _both_
the static Unix makefiles, Makefile.gtk and Makefile.ux. I haven't
actually retired those in this commit, but I'm pretty tempted.
[originally from svn r9239]
rather than relying on the user to edit the Makefile. Makefile.gtk
still works as well as it ever did, but now we get a Makefile.in alongside
it. mkunxarc.sh now relies on autoconf and friends to build the configure
script for the Unix source distribution.
[originally from svn r5673]
long last to move all the Windows-specific source files down into a
`windows' subdirectory. Only platform-independent files remain at the
top level. With any luck this will act as a hint to anyone still
contemplating sending us a Windows-centric patch...
[originally from svn r4792]
files as well as an nmake makefile. Needed line-end tweakery in
order to be able to generate usable project files when run on Unix,
but other than that appears fine. Ooh!
[originally from svn r3721]
analysis (for both .c and .rc files). Generates the VC++ makefile as
well as the other two; the authoritative source is now the new file
`Recipe' rather than any particular Makefile. Note that `Makefile'
is still here as a relic of the old way until we stop the nightly
builds using it, but it'll be gone soon.
[originally from svn r1592]