DISCLAIMER: This site is a mirror of the original, once available at http://iki.fi/~tuomov/b/


1. It's been two years since I wrote “The case against blurred fonts”. What has happened in the meantime?

Microsoft has joined the blur fascist front headed by the FOSS herd. In Vista, it is no longer possible to disable blurry fonts completely – at least not in a straightforward manner. The same option as in XP only disables them in some components, not all. Some of the default fonts also do not rasterise well, seeming to lack hinting. MacOS X, on the other hand, has been blur fascist since its conception. While there exist third-party tools (TinkerTool) to disable blurring, from what I've gathered (for I've never actually used MacOS X), it also seems to lack the bytecode interpreter.

I've noticed that some cell phones now have blurred fonts; I don't know whether the blurring can be disabled. Probably not: monocultures are en vogue. I fear the day when my current phone finally dies. (I've had it for 3.5 years so far, with no plans of replacing it until necessary. The battery is no longer in good condition.)

The only good thing to happen during these two years, to my knowledge, is that Debian now includes the bytecode interpreter. Now you don't have to compile Xft yourself, if you happen to be running Debian. But that is of no help on the more commercially-oriented distributions that you are more likely to find on a random computer: they do not include the bytecode interpreter.
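
For the record, the interpreter itself lives in FreeType, underneath Xft, and at the time of writing it is a compile-time switch that most distributions leave off for patent reasons. From what I recall of the FreeType 2.x builds of the period – so take this as a sketch, not a recipe – enabling it yourself means defining one macro and rebuilding the library:

    /* include/freetype/config/ftoption.h */
    /* Uncomment (or add) this definition and rebuild FreeType to enable
       the TrueType bytecode interpreter, which distributions of the time
       shipped disabled because of patent worries. */
    #define TT_CONFIG_OPTION_BYTECODE_INTERPRETER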

But the bytecode interpreter and manual hinting are, in the present light, absolutely essential for good – i.e. unblurred – rasterisation on devices with resolutions as poor as computer screens have. The freetype autohinter is nothing but a joke in the case of unblurred fonts. (With blurred fonts the results don't differ that much from the bytecode interpreter – they're bad in either case.) In fact, there actually seems to be a trend that fonts no longer come with hinting. The only commonly available good TTF fonts pretty much seem to be those in the msttcorefonts package, which isn't many (and some of those are bad too).

2. Even given the bytecode interpreter and well-hinted fonts, it is still an enormous amount of work to disable blurring on fascist operating systems. It seems practically impossible on all the latest mainstream operating systems: third-party hacks on MacOS X (and still you won't get good fonts), I don't know if it's even doable on Vista, and XML programming for Xft/fontconfig-based software – which is what most new programs on *nix tend to be.

Yes, the fontconfig XML “configuration files” are really source code, so complex and cluttered are they. And the setup is simply brain-damaged: you can't just create a long and complete ~/.fonts.conf once and copy it over to another system (and even if you could, it would still be too much work), because it does not override the system settings: the system files load it at a point that depends on the distribution in question, and you have to write code to override code from the system files – if that is doable at all, depending on the setting. That's the FOSS attitude: “use the source, we can't be arsed to do anything properly”. You could just as well ask people to crack closed-source software.
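
To see why the ordering matters: a stock /etc/fonts/fonts.conf of the period pulls in the pieces roughly like this (the exact paths and their order vary by distribution, which is precisely the problem):

    <include ignore_missing="yes">conf.d</include>
    <include ignore_missing="yes">~/.fonts.conf</include>

Your ~/.fonts.conf is just one more included file among the rest, not a final word; whatever the conf.d pile has decided, you get to undo rule by rule – if the setting in question can be undone from user configuration at all.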

And what does this code you have to write need to do? Turn on the bytecode interpreter; disable blurring; disable the anti-bitmap and anti-Helvetica blocks; replace FuglyStream Vera as the default with a better font (such as the bitmap Helvetica). All this is spread all over the place, and consists of a hundred lines of illegible XML (in a large number of files, on systems that use a conf.d mess instead of a single file). It's not practical to do it all in ~/.fonts.conf, so you need root permissions, of all things. Such a requirement is a fundamental design mistake for basic configuration that should be mostly user-specific instead of system-specific. What is more, you have to do this again on every different system you come across, because they all differ.
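
For the parts that can live in ~/.fonts.conf, a minimal sketch looks roughly like this – the bytecode interpreter itself cannot be switched on here, that being FreeType's business, and the anti-bitmap blocks usually have to be removed from the system's own files, hence the root permissions:

    <?xml version="1.0"?>
    <!DOCTYPE fontconfig SYSTEM "fonts.dtd">
    <fontconfig>
      <!-- No blurring; use the fonts' own (bytecode) hints at full strength. -->
      <match target="font">
        <edit name="antialias" mode="assign"><bool>false</bool></edit>
        <edit name="autohint" mode="assign"><bool>false</bool></edit>
        <edit name="hinting" mode="assign"><bool>true</bool></edit>
        <edit name="hintstyle" mode="assign"><const>hintfull</const></edit>
      </match>
      <!-- Prefer Helvetica to the Vera default for sans-serif. -->
      <alias>
        <family>sans-serif</family>
        <prefer><family>Helvetica</family></prefer>
      </alias>
    </fontconfig>

And that is only the easy half: whether it actually takes effect still depends on what the system files do around it.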

3. It seems that most of the features of fontconfig's over-verbosity-fad-compliant (i.e. XML-based) font-selection language are needed and meant for font authors to tune the system to their needs. Users typically should only need a few options, such as those listed above (and, for those with the popular blur fetish, options to control the blurring threshold and to change the subpixel orientation – the shade of the rainbow in their fonts). This could be done through a very simple .INI-based configuration system (or rather an abstract path-based configuration registry, which the user could choose to access through the nice .INI format). Perhaps scripts in fontconfig's existing XML programming language could even use variables obtained this way to tune their behaviour.
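
Something like the following, purely hypothetical, file is all a user should ever need to write. None of these names exist in any current software; they are only here to show how little is actually required:

    ; ~/.fontsrc -- hypothetical format, read by nothing that exists today
    [rendering]
    antialias  = false
    hinting    = full        ; use the fonts' own bytecode hints
    autohinter = false

    [defaults]
    sans-serif = Helvetica   ; instead of FuglyStream Vera
    serif      = Times
    monospace  = Courier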

But I guess the authors of these pieces of crapware think that people who do not want to spend enough time to become essentially developers themselves should only use such cheap plastic clones of Windows and OS X as Gnome and KDE, which include their own font configuration interfaces. Such is the trend: power users are a dying breed, and what remains are developers and idiot users at the mercy of their ever more complex systems (intended to increase the developers' power and keep their jobs).

4. Returning to fonts in Vista: often, where it insists on blurring, the font is in italics, which indeed typically looks very bad without blurring. But then I'd rather not use italics at all, or use some very lightly slanted font with good hinting. The thing with present screen resolutions is that you can't assume there are no pixels, as the current religion does. Screens do not have “virtually infinite” resolution the way printers already do. Even the poorest laser printers have 600dpi resolution, and mid-range printers 1200dpi. Typical screens are less than 100dpi; some laptops – and the above-mentioned cell phones – slightly more, but not much, and certainly not enough to hide the blurring. Some say 200dpi screens could easily be produced if there were demand. But 200dpi is not enough for blurring not to show. I can see the dots in 300dpi prints. 600dpi… maybe. We're far from that. Screen resolutions are awful. You have to embrace the pixels, not act as if they were not there. But the AA fascist movement wants us to suffer blurred fonts while waiting for such resolutions at the end of the rainbow, as ideologues typically ask us to do.

The word “unreal” perhaps best describes blurred fonts when mixed with otherwise crisp graphics or flat-coloured backgrounds. They just look out of place, especially at small sizes, where the proportion of grey to black pixels is huge. Blurring isn't all that bad at very big font sizes, say 50px or more in height. At such sizes you actually get noticeable pixelisation without blurring (except with a very blocky retro font that could be fully conveyed with far fewer pixels). This is in contrast to the situation at small sizes, where you can actually use the pixels to your advantage (with hinting or bitmapped fonts). But for a consistent look I prefer not to blur big fonts either. At those sizes the choice is between two different annoying effects, whereas at the “pixel scale”, fonts designed to take advantage of (instead of working against) those pixels are clearly better and cleaner.

By the way, that 50px height is about 4 times the height of my current font (14px), so on a 400dpi display – roughly four times the linear resolution of today's sub-100dpi screens – it would be the regular font size. So at such resolutions, although the rasterisation or the blurring might still be visible, it should make no difference which one it is: blurring has defeated itself. Thus it becomes pointless, for it takes extra computing cycles compared to plain rasterisation. (But, of course, the normal way of doing things is to build a new power plant and keep on wasting cycles on blurring, even when it makes no difference.)

On the other hand, when the screen background is noticeably and smoothly irregular, blending the font into the background with anti-aliasing is clearly better than a crisp font, which would look out of place, unreal. (Notice I didn't say “blurring”: in this case “anti-aliasing” in the proper signal-processing sense often applies. The whole image, with the text as a part of it, can be considered anti-aliased.) But noticeable irregularity is not the case in most usable application backgrounds, which are essentially flatly coloured.

5. But the general trend in computing seems to be that, married to the tyranny of statistics, OS designers know better, and therefore personal choice is made difficult, and systems complex and bureaucratic. You're expected to sacrifice the present on the altar of the future, and personal ergonomics on the altar of fads and FOSS high priests.

DOS, as a simple and manageable system, was the high point of computing. XP was a local maximum: the last OS with good fonts, and not even nearly as bad as W95, around whose release I switched from DOS to what can now clearly be seen as the failure known as Linux/FOSS.

Years ago, you constantly heard from the blur-fascist crowd how fonts suck in Linux. Now they've had their way, and fonts indeed do suck in Linux. The “ooh, new and shiny and thus good!” herd just keeps on destroying the computing experience (among other things).

Time to pick up the hoe and axe.