DISCLAIMER: This site is a mirror of original one that was once available at http://iki.fi/~tuomov/b/
Ages ago, it was fun to just play with the computer. These days I'd rather use computers merely as tools towards other ends. Unfortunately, most of the end-user software and the hardware is so crappy and constantly broken – I wager that only cars, which I detest even more than the former, come even close to this level of crappiness – that I have to write my own. As there are so many different kinds of software in the world, and most of it bad, there obviously isn't time to replace all of it, and this not-a-blog serves as an outlet for those ideas I don't see myself implementing in the near future (i.e. most of them). Regarding actually implementing some of those ideas, while there are some rather nice languages and other tolerable ones – languages, after all, being more a matter of computer science that has very little to do with real-world computers per se – the actual build tools (and libraries) are crap. If I have to spend time programming something, I'd rather concentrate on the programming problem than fight the idiosyncrasies of bad tools.
The GNU linker.
Its dependency handling is a totally fucked up mess. First of all, it's
really picky about the order of files given on the command line, and can't
find shit if things aren't very meticulously ordered how it wants them.
Secondly, it's very difficult to make it use the version of a (shared)
library I want to use. With the option
-lfoo it is obviously not possible to
specify which version to use, but also, for example, linking against
/usr/lib/libfoo.so.1.2.3 will not produce a binary that links against
libfoo.so.1. The same happens when attempting to link directly against
/usr/lib/libfoo.so. The linker is trying to be too smart, and thus does the wrong
thing. While there may exist some semi-documented option to select a
particular version, such complexity should not be necessary. Not that this
library versioning insanity is solely the linker's fault.
Make. Makes life laboursome. The same mess of broken rules gets
rewritten in every single program. At least it's not trying to be too smart,
but Makefile maintenance is a bit too laboursome with bigger programs, and
make itself is quite slow at traversing directory hierarchies.
With make one also has to deal directly with all the lower-level
compilation and linking tools that have their own flaws, as described
above.
Autoconf. Where do I begin? Autoconf is a solution waiting for a problem… or rather, a solution
creating a problem. Instead of some simple standard scheme having been
devised, by which libraries could report their configuration (such as
pkg-config), and a small library having been written to handle some
common incompatibilities between different unices and architectures,
every program is now supposed to contain the same broken scripts to check
for a zillion possible ways a library could have been installed on the system,
and for the same architecture dependencies. The unix modularity principle
has been completely lost in the autoconf madness. Furthermore, all the
checking and script-running takes a lot of time – often longer than the
actual compile – and creates many points of failure. Even complex makefiles
are relatively easy to fix compared to the cryptic autoconf scripts, which –
when they fail, as they often do – prevent even a part of the program from being built.
Now, not necessarily every incompatibility and system-dependency can –
or even should – be handled by a standardisation library, but the vast
majority of cases are best handled that way, even if the library is just
a set of
#defines. For the other cases, the build system should support
some checking routines. However, the build process should not depend on
these scripts being able to run and produce the actual build rules.
These checking routines should be part of the actual "make" system, and
all of them easily replaceable by manually-defined values.
Libtool. I used it once with Ion, but got fed up with it quite quickly for various reasons, the last straw being its inability to deal with dynamically loaded modules depending on each other without awful hacks, and its forcing everything into some model of how it thinks library files should be installed (and named? – my memory is vague here). Even in general it seemed to rely too much on the lowest common denominator of different architectures. This uniformisation is the wrong approach to take in a compatibility library. A good compatibility library should facilitate taking advantage of the peculiarities of each target system: in some cases by providing enough abstractions, and in other cases by providing the uniform approach, but letting one choose a subset of systems to target, if one so desires.
ghc --make. Something good to say for a change here.
ghc --make Main.hs, and that's it. It can trace the other files
that need to be compiled from the imports of Main.hs. That's the way build
systems should work. Unfortunately,
ghc --make can only handle pure
Haskell programs, and not ones that need to hack into C with FFI
and such things that many "real-world" programs need to do. Thus Makefiles
are needed to do that part. Also, the ghc package system is a total mess
that is constantly broken (at least on Debian/testing… but then again,
that must be the most broken distro on earth; even "unstable" often
seems to be less broken).
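To make the point concrete, here is a sketch of the smallest possible two-module program (module names are illustrative); given only Main.hs, ghc --make follows the import chain to Util.hs with no build rules written anywhere:

```shell
cat > Util.hs <<'EOF'
module Util (greeting) where
greeting :: String
greeting = "hello"
EOF
cat > Main.hs <<'EOF'
module Main where
import Util (greeting)
main :: IO ()
main = putStrLn greeting
EOF
# ghc --make Main.hs    # would compile Util.hs too, and link the program
echo "wrote Main.hs and Util.hs"
```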
Scons. In some aspects this seems like an improvement over make, and in some aspects it is much worse. At least make used to have a simple rule-based syntax, although it was later extended with various programming and preprocessor constructs. Scons scripts, on the other hand, are exactly that: Python scripts, meaning they're written in a Turing-complete language. That's a big mistake, in my opinion. The build system of a program or a project should be a description of the project, a configuration file that can be understood by a wide variety of tools, and using a Turing-complete language doesn't lend itself to that. (Yes, I am also guilty as charged of using a scripting language for program configuration, but I have my sorry excuses that may be elaborated in another post.)
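For readers who haven't seen one, an SConstruct file is literally a Python script – a minimal sketch (the header path and define are illustrative) showing how arbitrary control flow creeps into what should be a declarative description:

```python
# SConstruct -- the build description is executable Python.
import os

env = Environment(CCFLAGS='-O2')
if os.path.exists('/usr/include/bar.h'):   # ad-hoc configuration logic
    env.Append(CPPDEFINES=['HAVE_BAR'])
env.Program('foo', ['foo.c'])
```

Nothing but running the script can tell another tool what the project actually contains.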
One nice thing about scons is that it provides pre-defined "builders"
for building programs, libraries and so on. Unfortunately, these
pre-defined builders also seem to have a fixation on
.a files, which is neither necessary nor
appropriate for dynamically loaded program modules. Also, it doesn't
extend these builders to configuration and external library dependencies,
instead replacing the GNU autoconf M4 mess with a Python mess.
On the one hand, a rather nice-seeming design decision of scons is
that installation targets are targets just like any other files.
On the other hand, regarding how the building of these targets is triggered,
well, I can't help but wonder what it is that they were smoking when
designing the mechanism. You're supposed to either trigger these rules
explicitly with scons $PREFIX, or create an alias so that
scons install builds them. This is a completely unnecessary hack
that adds a lot of complication if not everything should go under
$PREFIX. Why can't e.g.
scons --install simply cause all the installation
targets (or anything that isn't contained in the working directory)
to be set as targets to build, when they normally wouldn't be? Why this
complex play with paths as targets?
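The mechanism being criticized looks roughly like this in an SConstruct (a sketch; the PREFIX value and file names are illustrative):

```python
# Installation is expressed as ordinary file targets plus an alias.
env = Environment(PREFIX='/usr/local')
prog = env.Program('foo', ['foo.c'])
installed = env.Install('$PREFIX/bin', prog)  # a target like any other file
env.Alias('install', installed)               # so "scons install" triggers it
```

Without the Alias line, one would have to name the installation path itself on the command line to get anything installed – the path-as-target play complained about above.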
A better build system. What would a better build system look like then?
-Wl,-whole-archive hack), or this can ease the support of various different compilers.
Information on external libraries should be obtained through pkg-config, unless the information has been manually provided.
An example can be more instructive than a bunch of words, so here's one,
without all the tricky details worked out. (Conditional builds could
perhaps take the form
[builder when condition], if sticking to this
.INI-style format, which I rather like for its robustness.)
[environment]
PREFIX = /usr

[external-library]
name = bar
version = 2.1
# lib =
# headers =

[check]
# This should in fact be standard!
name = int-at-least-32bit
script = build-scripts/checksize int 32
# value = yes

[program]
target = foo
check = int-at-least-32bit
libraries = bar
c_sources = foo.c

[install]
target = $PREFIX/bin/foo
source = foo
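One payoff of such a format is that generic tools can read it. A sketch using Python's standard configparser on a cut-down version of the description above (note that a strict INI parser requires unique section names, so a real implementation would need minor extensions for repeated sections):

```python
# Reading a build description with a stock INI parser -- no build system needed.
import configparser

text = """
[external-library]
name = bar
version = 2.1

[program]
target = foo
libraries = bar
c_sources = foo.c
"""

cfg = configparser.ConfigParser()
cfg.read_string(text)
print(cfg["program"]["target"])     # the program's output file
print(cfg["program"]["libraries"])  # its external library dependencies
```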