DISCLAIMER: This site is a mirror of the original, once available at http://iki.fi/~tuomov/b/


Some days ago I wanted to transfer a few files to a friend. The usual way of doing that in my circles is netcat. However, there were some problems with the transfer, and netcat, obviously, doesn't support resume – and I'm not particularly fond of the potentially hazardous tar cf piping either – so I started looking for an equally simple alternative.
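For reference, the usual netcat transfer looks roughly like the following sketch. The flags are those of traditional netcat (BSD variants want just -l without -p), and the host name and port here are made up:

    # receiving end: listen on some port and write the incoming data to a file
    nc -l -p 5555 > file

    # sending end: push the file to the receiver
    nc friend.example.org 5555 < file

    # the hazardous tar cf piping variant, for whole directories:
    tar cf - somedir | nc friend.example.org 5555    # sender
    nc -l -p 5555 | tar xf -                         # receiver

If anything hiccups mid-transfer, you start over from zero; that is the problem.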

DCC isn't such a simple option, because people tend to run IRC in screens on other systems, and ssh and ftp accounts are not suitable for only occasional transfers. I sought a dummy-ftpd that could run as a normal user with no static configuration at all: the port to listen on, the directory to chroot to (defaulting to the current directory), and the only available username and password all given on the command line. I couldn't find one. I tried torrents, but they're rather complicated too (and e.g. Azureus's tracker page didn't work – and anyway I'd rather not use crap that uses both Java and, especially, GTK).
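To illustrate, the invocation I had in mind was something like the following; the program name and every flag here are hypothetical, since no such tool seemed to exist:

    # entirely hypothetical program and options; this is the interface
    # I was looking for, not anything that actually exists:
    dummy-ftpd --port 2121 --user guest --password hunter2 --chroot .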

So I started thinking about how these things were done in the good old days, besides swapping diskettes. You'd dial up a friend's computer and initiate a transfer using one of the modem file transfer protocols. Some of them supported resume and everything. Now, there's an idea! Could one pipe ZMODEM, for example, through bi-directional netcat? man sz. "Hey, what's this? --tcp-server? --tcp-client?" The lrzsz implementation already supports TCP transfers. Very nice. There's the simple file transfer program I wanted.
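As far as I can tell from the man page, a transfer then goes roughly as follows; sz picks a port and reports where to connect, and the host name below is made up:

    # sending machine: open a listening socket; sz prints the host:port
    # that the other end should connect to
    sz --tcp-server file-to-send

    # receiving machine: connect to whatever sz printed
    rz --tcp-client "sender.example.org:PORT"

Resume and the other ZMODEM niceties should then work just as they did over a phone line.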

The title of this story, "vanhassa vara parempi", is a Finnish saying that roughly means "old stuff is more trustworthy". The story above is one example of it, at least subjectively. FTP may be older than the particular modem transfer protocols, but subjectively, for me, it is newer. Also, modem transfer protocols never had the chance to mature enough to catch a case of creeping featurism, unlike FTP daemons. Perhaps the age of programs, protocols and other software concepts should be measured by version numbers and the number of implementations, not by first release date. As things mature too much, they lose any elegance they might once have had.

Another metric of software age, one that to a great degree agrees with "vanhassa vara parempi", is the kind of computing resources the program was designed for. The most power-hungry programs for a given task also tend to be the worst. TeX and LaTeX were written for the computers of the 70s and early 80s and thus have some rather annoying limitations. Nevertheless, coupled with a simple text editor, they still hands-down beat any word processor that won't run except on the latest hardware. And as for text editors, I'm not a fan of emacs, which in the old days earned the backronym Eight Megabytes and Constantly Swapping for a reason. The Unix command line beats WIMP GUIs for similar tasks, and so do the UIs of many old DOS programs. In fact, I think there's much to be learned from DOS-era UIs, but I'll leave that for another article. I could go on with these examples, but I think the above ones make the point. There are exceptions, of course.

The problem of abundant computing resources applies to films as well. I don't like wannabe-photorealistic CGI-fests, and I don't mean just the plot or lack thereof. What I mean is that the animation and rendering simply aren't good enough to be watchable. In the old days, when special effects were more expensive and cruder than today, they were used more subtly, and with more skill. Not so anymore. Possibly thanks to smaller budgets compared to cinema productions, space sci-fi series fortunately still mostly use CGI for cutaway shots of technological objects, and it works well there. But once you try to do anything living with wannabe-photorealistic CGI, crap is what you get. Of course, the situation may change with a huge increase in computing power and modelling skills, but at the moment extensive use of CGI is just annoying.