mirror of https://git.tartarus.org/simon/putty.git synced 2025-01-08 08:58:00 +00:00
putty-source/release.pl
Commit c19e7215dd by Simon Tatham: Replace mkfiles.pl with a CMake build system.
This brings various concrete advantages over the previous system:

 - consistent support for out-of-tree builds on all platforms

 - more thorough support for Visual Studio IDE project files

 - support for Ninja-based builds, which is particularly useful on
   Windows where the alternative nmake has no parallel option

 - a really simple set of build instructions that work the same way on
   all the major platforms (look how much shorter README is!)

 - better decoupling of the project configuration from the toolchain
   configuration, so that my Windows cross-building doesn't need
   (much) special treatment in CMakeLists.txt

 - configure-time tests on Windows as well as Linux, so that a lot of
   ad-hoc #ifdefs second-guessing a particular feature's presence from
   the compiler version can now be replaced by tests of the feature
   itself

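A configure-time feature test of the kind described might look like this in CMake (an illustrative sketch; the symbol and header probed here are examples, not PuTTY's actual checks):

```cmake
include(CheckSymbolExists)

# Probe the toolchain for the feature itself, instead of guessing
# from compiler version macros. Sets HAVE_STRTOK_S if the symbol
# is declared in string.h.
check_symbol_exists(strtok_s string.h HAVE_STRTOK_S)
if(HAVE_STRTOK_S)
  add_compile_definitions(HAVE_STRTOK_S)
endif()
```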
Also some longer-term software-engineering advantages:

 - other people have actually heard of CMake, so they'll be able to
   produce patches to the new build setup more easily

 - unlike the old mkfiles.pl, CMake is not my personal problem to
   maintain

 - most importantly, mkfiles.pl was just a horrible pile of
   unmaintainable cruft, which even I found painful to change or to
   use, and which desperately needed throwing in the bin. I've
   already thrown away all the variants of it I had in other projects
   of mine, and was only delaying this one so we could make the 0.75
   release branch first.

This change comes with a noticeable build-level restructuring. The
previous Recipe worked by compiling every object file exactly once,
and then making each executable by linking a precisely specified
subset of the same object files. But in CMake, that's not the natural
way to work - if you write the obvious command that puts the same
source file into two executable targets, CMake generates a makefile
that compiles it once per target. That can be an advantage, because it
gives you the freedom to compile it differently in each case (e.g.
with a #define telling it which program it's part of). But in a
project that has many executable targets and had carefully contrived
to _never_ need to build any module more than once, all it does is
bloat the build time pointlessly!
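For instance (a hypothetical sketch, not PuTTY's real CMakeLists.txt), listing the same source file in two executable targets makes CMake compile it once per target, which is what permits a different #define in each:

```cmake
# shared.c appears in both targets, so CMake compiles it twice...
add_executable(progA main_a.c shared.c)
add_executable(progB main_b.c shared.c)

# ...which allows each copy to be built with a different define,
# e.g. telling shared.c which program it is part of.
target_compile_definitions(progA PRIVATE IN_PROGA)
target_compile_definitions(progB PRIVATE IN_PROGB)
```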

To avoid slowing down the build by a large factor, I've put most of
the modules of the code base into a collection of static libraries
organised vaguely thematically (SSH, other backends, crypto, network,
...). That means all those modules can still be compiled just once
each, because once each library is built it's reused unchanged for all
the executable targets.
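In CMake terms the arrangement looks roughly like this (an illustrative sketch; the library, target and file names here are invented, not the real ones):

```cmake
# Each thematic group of modules becomes a static library,
# compiled exactly once...
add_library(crypto STATIC aes.c sha256.c mpint.c)
add_library(network STATIC proxy.c x11fwd.c portfwd.c)
add_library(sshcommon STATIC ssh/common.c ssh/transport2.c)

# ...and each executable just names the libraries it needs; the
# linker picks out the individual object files automatically.
add_executable(plink plink.c)
target_link_libraries(plink sshcommon crypto network)

add_executable(pscp pscp.c)
target_link_libraries(pscp sshcommon crypto network)
```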

One upside of this library-based structure is that now I don't have to
manually specify exactly which objects go into which programs any more
- it's enough to specify which libraries are needed, and the linker
will figure out the fine detail automatically. So there's less
maintenance to do in CMakeLists.txt when the source code changes.

But that reorganisation also adds fragility, because of the
traditional Unix linker semantics of walking along the library list
just once, so that cyclic references between your libraries will
provoke link errors. The current setup builds successfully, but I
suspect it only just manages it.

(In particular, I've found that MinGW is the most finicky on this
score of the Windows compilers I've tried building with. So I've
included a MinGW test build in the new-look Buildscr, because
otherwise I think there'd be a significant risk of introducing
MinGW-only build failures due to library search order, which wasn't a
risk in the previous library-free build organisation.)

In the longer term I hope to be able to reduce that risk, via
gradual reorganisation (in particular, breaking up too-monolithic
modules, to reduce the risk of knock-on references when you include a
module for function A and it also contains function B with an
unsatisfied dependency you didn't really need). Ideally I want to
reach a state in which the libraries all have sensibly described
purposes, a clearly documented (partial) order in which they're
permitted to depend on each other, and a specification of what stubs
you have to put where if you're leaving one of them out (e.g.
nocrypto) and what callbacks you have to define in your non-library
objects to satisfy dependencies from things low in the stack (e.g.
out_of_memory()).

One thing that's gone completely missing in this migration,
unfortunately, is the unfinished MacOS port linked against Quartz GTK.
That's because it turned out that I can't currently build it myself,
on my own Mac: my previous installation of GTK had bit-rotted as a
side effect of an Xcode upgrade, and I haven't yet been able to
persuade jhbuild to make me a new one. So I can't even build the MacOS
port with the _old_ makefiles, and hence, I have no way of checking
that the new ones also work. I hope to bring that port back to life at
some point, but I don't want it to block the rest of this change.
2021-04-17 13:53:02 +01:00


#!/usr/bin/perl
# Script to automate some easy-to-mess-up parts of the PuTTY release
# procedure.
use strict;
use warnings;
use Getopt::Long;
use File::Find;
use File::Temp qw/tempdir/;
use LWP::UserAgent;
my $version = undef;
my $setver = 0;
my $upload = 0;
my $precheck = 0;
my $postcheck = 0;
my $skip_ftp = 0;
GetOptions("version=s" => \$version,
           "setver" => \$setver,
           "upload" => \$upload,
           "precheck" => \$precheck,
           "postcheck" => \$postcheck,
           "no-ftp" => \$skip_ftp)
    or &usage();
# --setver: construct a local commit which updates the version
# number, and the command-line help transcripts in the docs.
if ($setver) {
    defined $version or die "use --version";
    0 == system "git", "diff-index", "--quiet", "--cached", "HEAD"
        or die "index is dirty";
    0 == system "git", "diff-files", "--quiet" or die "working tree is dirty";
    my $builddir = tempdir(DIR => ".", CLEANUP => 1);
    0 == system "git archive --format=tar HEAD | ( cd $builddir && tar xf - )"
        or die;
    0 == system "cd $builddir && cmake . -DCMAKE_C_FLAGS=-DRELEASE=${version}"
        or die;
    0 == system "cd $builddir && cmake --build . -t pscp -t plink -j" or die;
    our $pscp_transcript = `cd $builddir && ./pscp --help`;
    $pscp_transcript =~ s/^Unidentified build/Release ${version}/m or die;
    $pscp_transcript =~ s/^/\\c /mg;
    our $plink_transcript = `cd $builddir && ./plink --help`;
    $plink_transcript =~ s/^Unidentified build/Release ${version}/m or die;
    $plink_transcript =~ s/^/\\c /mg;
    &transform("LATEST.VER", sub { s/^\d+\.\d+$/$version/ });
    our $transforming = 0;
    &transform("doc/pscp.but", sub {
        if (/^\\c.*>pscp$/) { $transforming = 1; $_ .= $pscp_transcript; }
        elsif (!/^\\c/) { $transforming = 0; }
        elsif ($transforming) { $_ = ""; }
    });
    $transforming = 0;
    &transform("doc/plink.but", sub {
        if (/^\\c.*>plink$/) { $transforming = 1; $_ .= $plink_transcript; }
        elsif (!/^\\c/) { $transforming = 0; }
        elsif ($transforming) { $_ = ""; }
    });
    &transform("Buildscr", sub {
        s!^(set Epoch )\d+!$1 . sprintf "%d", time/86400 - 1000!e });
    0 == system ("git", "commit", "-a", "-m",
                 "Update version number for ${version} release.") or die;
    exit 0;
}
# --upload: upload the release to all the places it should live, and
# check all signatures and md5sums once it arrives there.
if ($upload) {
    defined $version or die "use --version";
    # Run this inside the build.out directory.
    -d "maps" or die "no maps directory in cwd";
    -d "putty" or die "no putty directory in cwd";
    0 == system("rsync", "-av", "maps/",
                "thyestes:src/putty-local/maps-$version")
        or die "could not upload link maps";
    for my $location (["thyestes", "www/putty/$version"],
                      ["the", "www/putty/$version"],
                      ["chiark", "ftp/putty-$version"]) {
        my ($host, $path) = @$location;
        0 == system("rsync", "-av", "putty/", "$host:$path")
            or die "could not upload release to $host";
        open my $pipe, "|-", "ssh", $host, "cd $path && sh";
        print $pipe "set -e\n";
        print $pipe "pwd\n";
        find({ wanted => sub
               {
                   if (m!^putty/(.*)\.gpg$!) {
                       my $file = $1;
                       print $pipe "echo verifying $file\n";
                       if ($file =~ /sums$/) {
                           print $pipe "gpg --verify $file.gpg\n";
                       } else {
                           print $pipe "gpg --verify $file.gpg $file\n";
                       }
                   } elsif (m!^putty/(.*sum)s$!) {
                       print $pipe "echo checking ${1}s\n";
                       print $pipe "grep -vF ' (installer version)' ${1}s | grep . | $1 -c\n";
                   }
               }, no_chdir => 1}, "putty");
        print $pipe "echo all verified ok\n";
        close $pipe;
        die "VERIFICATION FAILED on $host" if $? != 0;
    }
    print "Uploaded $version OK!\n";
    exit 0;
}
# --precheck and --postcheck: attempt to download the release from its
# various web and FTP locations.
if ($precheck || $postcheck) {
    defined $version or die "use --version";
    # Run this inside the build.out directory, so we can check the
    # downloaded files against the exact contents they should have.
    -d "putty" or die "no putty directory in cwd";
    my $httpprefix = "https://the.earth.li/~sgtatham/putty/";
    my $ftpprefix = "ftp://ftp.chiark.greenend.org.uk/users/sgtatham/putty-";
    # Go through all the files in build.out.
    find({ wanted => sub
           {
               if (-f $_) {
                   die unless (m!^putty/(.*)$!);
                   my $path = $1;
                   # Don't try to check .htaccess - web servers will
                   # treat it weirdly.
                   return if $path =~ m!^(.*/)?\.htaccess$!;
                   print "Checking $path\n";
                   my $real_content = "";
                   open my $fh, "<", $_ or die "$_: open local file: $!";
                   $real_content .= $_ while <$fh>;
                   close $fh;
                   my $http_numbered = "${httpprefix}$version/$path";
                   my $http_latest = "${httpprefix}latest/$path";
                   my $ftp_numbered = "${ftpprefix}$version/$path";
                   my $ftp_latest = "${ftpprefix}latest/$path";
                   my ($http_uri, $ftp_uri);
                   if ($precheck) {
                       # Before the 'latest' links/redirects update,
                       # we just download from explicitly version-
                       # numbered URLs.
                       $http_uri = $http_numbered;
                       $ftp_uri = $ftp_numbered;
                   }
                   if ($postcheck) {
                       # After 'latest' is updated, we're testing that
                       # the redirects work, so we download from the
                       # URLs with 'latest' in them.
                       $http_uri = $http_latest;
                       $ftp_uri = $ftp_latest;
                   }
                   # Now test-download the files themselves.
                   unless ($skip_ftp) {
                       my $ftpdata = `curl -s $ftp_uri`;
                       printf " got %d bytes via FTP", length $ftpdata;
                       die "FTP download for $ftp_uri did not match"
                           if $ftpdata ne $real_content;
                       print ", ok\n";
                   }
                   my $ua = LWP::UserAgent->new;
                   my $httpresponse = $ua->get($http_uri);
                   my $httpdata = $httpresponse->{_content};
                   printf " got %d bytes via HTTP", length $httpdata;
                   die "HTTP download for $http_uri did not match"
                       if $httpdata ne $real_content;
                   print ", ok\n";
                   # Check content types on any files likely to go
                   # wrong.
                   my $ct = $httpresponse->{_headers}->{"content-type"};
                   if (defined $ct) {
                       printf " got content-type %s", $ct;
                   } else {
                       printf " got no content-type";
                   }
                   my $right_ct = undef;
                   if ($path =~ m/\.(hlp|cnt|chm)$/) {
                       $right_ct = "application/octet-stream";
                   } elsif ($path =~ /\.gpg$/) {
                       $right_ct = "application/pgp-signature";
                   }
                   if (defined $right_ct) {
                       if ($ct ne $right_ct) {
                           die "content-type $ct should be $right_ct";
                       } else {
                           print ", ok\n";
                       }
                   } else {
                       print "\n";
                   }
                   if ($postcheck) {
                       # Finally, if we're testing the 'latest' URL,
                       # also check that the HTTP redirect header was
                       # present and correct.
                       my $redirected = $httpresponse->{_request}->{_uri};
                       printf " redirect -> %s\n", $redirected;
                       die "redirect header wrong for $http_uri"
                           if $redirected ne $http_numbered;
                   }
               }
           }, no_chdir => 1}, "putty");
    print "Check OK\n";
    exit 0;
}
&usage();
sub transform {
    my ($filename, $proc) = @_;
    my $file;
    open $file, "<", $filename or die "$filename: open for read: $!\n";
    my $data = "";
    while (<$file>) {
        $proc->();
        $data .= $_;
    }
    close $file;
    open $file, ">", $filename or die "$filename: open for write: $!\n";
    print $file $data;
    close $file or die "$filename: close after write: $!\n";
}
sub usage {
    die "usage: release.pl --version=X.YZ [--setver | --upload | --precheck | --postcheck]\n";
}