beid security

Belgian electronic ID card and security

I received some feedback on my previous BeID post, where I explained how to log in to a remote system by use of your digital ID card.

Philip and Guy both claim that it is possible or even likely that the crypto keys on eID cards are stored somewhere on a government system, and that you therefore should not be using them.

I find this to be jumping the gun a little. Security is always a trade-off: if you want to hide things from the government, then using eID-based authentication might indeed not be a very good idea. However, there are other cases where eID-based authentication really is the correct option, and it is nice to know that this kind of thing is possible. As long as you make an informed choice, there is nothing wrong with giving someone else access to your server.

The fun bit is that Philip actually advocates carrying a clear-text printout of a list of still-valid passwords. Now that, I feel, is unacceptable, security-wise. Personally, I'd rather use a smartcard of which only some secret government agency might be keeping copies than a list that any random thief can make use of. The question really boils down to this, I think: how likely is it that someone else will be able to get at your password by guessing or stealing it?

  • Using a password that's related to your person or someone near you, or is based on a dictionary: rather likely. There are scripts out there that will guess these passwords. Don't use them.
  • Using a printout of a list of one-time passwords: somewhat less likely (they can't be guessed by a script), but still not all that hard to obtain (wallets get stolen by the hundreds every day, and it usually takes a while before one notices)
  • Using smartcards with keys generated by someone else: somewhat unlikely (there may be a few people who know my private key, but one can't be sure of that; and, color me naive, but I tend to trust the government to abide by its own laws—for the most part—laws that prevent this kind of behavior)
  • Using strong passwords: rather unlikely, provided you take good care of them.

What I mean by that last bit is that you should always generate your passwords with a tool that produces fully random passwords ('pwgen -s', for example), and never write them down in cleartext anywhere. That of course makes them slightly hard to memorize; what I do is store the new password on my laptop, encrypted to my GPG key, and then memorize it by setting it as the password on a remote server that I need to access fairly often (Debian's systems will do), while leaving my GPG passphrase and laptop password as they are for the time being. Every time I need to log in to the remote server, I first try to remember the password; if I can't, I look it up in the encrypted file (taking care not to store the decrypted version on disk). Once I can log in to the remote server without having to look the password up, I change all my other passwords. This process usually takes a few weeks. After that, the only way someone can get my password is either by coercing me, or by killing me and dissecting my brain. And even then.
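
The generate-and-stash part of that routine boils down to a couple of commands. A minimal sketch, assuming pwgen and GnuPG are installed; the key ID and the file name are placeholders you would pick yourself:

#!/bin/sh
# Generate one fully random 16-character password.
PASS=$(pwgen -s 16 1)

# Store it encrypted to your own GPG key; 'you@example.com' and the
# file name are placeholders. The cleartext never touches the disk.
printf '%s\n' "$PASS" | gpg --encrypt --recipient you@example.com > ~/password-debian.gpg

# To look the password up later, decrypt to the terminal only, never to a file:
#   gpg --decrypt ~/password-debian.gpg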

Unfortunately, I realize that in the real world many people do not wish to go through all this trouble just to get a secure login, and instead simply choose a weak password. I guess that's why I think the next best thing might be slightly better than those weak passwords...

Posted
why I dont want an i something II

Why I don't want an iSomething, part II

Bjorn mentions that he doesn't want the Nokia Internet Tablet, because it doesn't have a phone and he doesn't want to carry two devices with him.

(photo: my portable media player and my cell phone)

Bottom left, my portable media player. 4 GB of solid-state storage; plays movies (the display shows a scene from Star Trek: Deep Space Nine that I converted from DVD) and audio, has an FM radio, and after a firmware update can do games and some random applications. Does Bluetooth. Does not do phone calls.

Upper right, my cell phone. A simple Nokia 6021. Does Bluetooth, makes phone calls, and (as the display somewhat shows) can do calendaring, too. With SyncML, I can synchronize it with my laptop.

Together, they do anything I could possibly want from a smartphone (and don't anyone tell me 'SSH', because doing ssh on any screen smaller than 7" and without a decent keyboard is laughable).

Together, they're smaller and weigh less than a smartphone.

Together, they're less expensive than a smartphone (€125 for the phone, €85 for the portable media player).

Why does anyone buy a smartphone?

Posted
nieuws

News

Frank wrote earlier about the fact that deredactie.be is a horribly ugly website, with a hateful interface that is a complete mess. So he spent some time on a different website where you can 'just' watch the news broadcast. His blog post explains the essentials: parse an Atom feed, which points to other Atom feeds, which in turn point to flv, mp4, and wmv files in various qualities, and so on. He turns those into RSS files, and then wraps a web-based FLV player around them.

All very nice, but that doesn't work on my laptop (a PowerPC G4 running Linux), and as far as I know it can't be played full-screen either.

So I thought, let's have a go at this ourselves.

#!/usr/bin/perl

use strict;
use warnings;

my @url;
my $i;

# Fetch the main Atom feed, which lists the per-programme feeds.
open MAINSTREAM, "wget -q -O - 'http://www.deredactie.be/cm/de.redactie/mediajournaal?mode=atom' |" or die "Could not run wget: $!";

while(<MAINSTREAM>) {
	# Every programme announces its own Atom feed in a link rel="self" element.
	if(/link rel="self".*title="([^"]*)".*href="(http:.*mode=atom)/) {
		my @l = ($1, $2);
		push @url, \@l;
	}
}
close MAINSTREAM;
my @l = ("newswire", "http://www.deredactie.be/cm/de.redactie/newswire?mode=atom");
push @url, \@l;

print "Programmas:\n";
for ($i=0;$i<=$#url; $i++) {
	print "$i. " . $url[$i][0] . "\n";
}
print "Welk programma? ";
my $prog = <>;
chomp $prog;
if ($prog =~ /\D/) {
	die "Eh, numbers please!";
}
if (!defined($url[$prog])) {
	die "Eh?!";
}

# Fetch the chosen programme's feed and write an m3u playlist with the
# high-quality H.263 MP4 URLs it refers to.
open PROGSTREAM, "wget -q -O - '" . $url[$prog][1] . "' |" or die "Could not run wget: $!";
open OUTPUT, "> " . $url[$prog][0] . ".m3u" or die "Could not write playlist: $!";
while(<PROGSTREAM>) {
	if(/(http:.*h263hi\.mp4)/) { print OUTPUT $1 . "\n"; }
}
close PROGSTREAM;
close OUTPUT;	# flush the playlist to disk before exec() replaces this process
exec("vlc -f --video-on-top --playlist '" . $url[$prog][0] . "'.m3u");

Et voilà.

There's about half a second of delay between clips, but I don't consider that a problem myself. And at least this way I'm rid of that horrible web browser.

Posted
free-metro

Reading the "Metro" on my Hanlin V3

It was recently brought to my attention that the free newspaper "Metro" has an online edition, where you can either download the entire newspaper in PDF form, or read it as a series of images of each page, in which clicking on an article (through the image's usemap) links to the relevant article in HTML form.

Of course my Hanlin doesn't have a web browser (it has a somewhat limited HTML parser, but that does not even understand hyperlinks, let alone usemaps), so the PDF version is what I need. Unfortunately, the only option "Metro" provides is a set of one-page PDF files. These are somewhat readable on my Hanlin (someone with worse eyes than mine would probably have trouble reading the small letters, but for me it's not a problem); however, instead of just using the 'page turn' buttons on my Hanlin, I now have to close the current file, open the next, and reconfigure the zoom all over again. That means this multi-PDF thing isn't exactly great, and that I'd much prefer getting one single PDF file instead.

So I got to work, and tried out a few things.

There doesn't appear to be anything around to merge multiple PDF files into one. There is 'psmerge', but that only handles PostScript files, not PDF ones. That's not a problem, though, because PDF files can easily be converted to PostScript and back, right?

Well, no.

The first pdf-to-PostScript converter that I tried was 'pdf2ps', a wrapper script around Ghostscript. Unfortunately, using that results in some ugliness:

(screenshots: the original PDF source as rendered by Xpdf, and the pdf2ps output as rendered by gv)

As you can see in the second image, using pdf2ps results in some quality loss. Ghostscript apparently doesn't understand the letter 'A' very well, and (more importantly) loses the anti-aliasing that is part of the PDF file. Converting this file back to PDF and storing it on the Hanlin results in unreadable text.

But it gets worse. Once I had converted all those PDF files to PostScript using pdf2ps and merged them with psmerge, the output was gibberish. Or Klingon, take your pick:

(screenshot: the psmerge output, as rendered by gv)

In case you were wondering, yes, that is the exact same fragment (counting lines sucks) at the exact same zoom level. Clearly this was a dead end.
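
For the record, the pipeline I had been trying looked roughly like this (file names made up for illustration):

# Convert each one-page PDF to PostScript with Ghostscript's pdf2ps wrapper...
for f in page-*.pdf; do
	pdf2ps "$f" "${f%.pdf}.ps"
done

# ...merge the resulting pages into a single PostScript file with psmerge...
psmerge -ometro.ps page-*.ps

# ...and convert the merged file back to PDF.
ps2pdf metro.ps metro.pdf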

I found that there was a second pdf-to-ps converter, called 'pdftops', which uses the xpdf code base to do its thing. This converter produced clearly better PostScript output; I could not see any difference between the Xpdf rendering of the original file and the gv rendering of the pdftops output. Also, the psmerge output does not garble things as badly:

(screenshot: the improved psmerge output, based on pdftops, as rendered by gv)

Sadly, it still loses the anti-aliasing, and with it the readability of the document on my e-book reader. Moreover, the pdftops output seems to confuse psmerge, to the extent that it hangs on some pages.

Another approach I tried was to import the pdftops output into Scribus, and create a multi-page PDF file with that program. Unfortunately, Scribus does not seem to like the pdftops output.

Xpdf also has a 'pdftoppm' converter. With that, I was able to create a multi-page PDF file that was not garbled, and that did still contain anti-aliasing. Unfortunately, since PPM is a raster image format, a PDF file created this way does not scale very well, resulting in artifacts and, once again, an unreadable file on the Hanlin.
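
For completeness, that raster detour goes roughly like this; a sketch rather than the exact commands I ran, and the last step assumes ImageMagick's convert is available:

# Render each one-page PDF to a PPM raster image at a fixed resolution...
for f in page-*.pdf; do
	pdftoppm -r 150 "$f" "${f%.pdf}"
done

# ...and bundle the resulting images into one multi-page (but raster) PDF.
convert page-*.ppm metro-raster.pdf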

So it appears that for now I'll be stuck with storing multiple files on my device. Sigh. I wish that wouldn't be necessary...

Update: So I needed pdftk.
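
With pdftk, merging the one-page files into a single document is a one-liner; something along these lines (file names made up for illustration):

pdftk page-*.pdf cat output metro.pdf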

Posted
autotools schell

Autotools and shell

Autotools can be nice, but they can be pretty ugly too, sometimes. Especially when you're trying to do something that the autotools weren't originally meant to do.

I was trying to use autotools to generate scripts in such a way that they'd be able to use variables that I've AC_SUBST'ed. At first, that did not seem possible, at least not unless I was willing to do what autoconf.info suggests:

edit = sed \
	-e 's,@datadir\@,$(pkgdatadir),g' \
	-e 's,@prefix\@,$(prefix),g'

autoconf: Makefile $(srcdir)/autoconf.in
	rm -f autoconf autoconf.tmp
	$(edit) $(srcdir)/autoconf.in >autoconf.tmp
	chmod +x autoconf.tmp
	mv autoconf.tmp autoconf

autoheader: Makefile $(srcdir)/autoheader.in
	rm -f autoheader autoheader.tmp
	$(edit) $(srcdir)/autoconf.in >autoheader.tmp
	chmod +x autoheader.tmp
	mv autoheader.tmp autoheader

(sic, including the "autoconf.in" in the autoheader stub. I don't know whether that's a bug in the documentation or whether autotools really is that ugly.)

This would work, but it has one downside: it can replace instances of "@datadir@" with the likes of "${prefix}/lib/${PACKAGE}", i.e., with strings that still contain unexpanded variable references. Or, if the user is really sadistic, with "${my_stupid_variable_specified_on_configure_command_line}/lib/${PACKAGE}". Personally, I do not like that approach.

So I had to find something else. With C programs, if you want to AC_SUBST something that could end up looking like the above, you just make sure you have something like -DFOO="@FOO@" in your AM_CFLAGS (or your program_CFLAGS); since make expands the nested $(prefix)-style references when it runs the compiler, this makes the fully resolved value of your AC_SUBSTed variable available (the proper way) to your C program. This is also why it can sometimes be a good idea not to have a config.h...

Anyway, the solution to my little problem was to be found that way. First, I have a little sed program:

#!/bin/sh
sed -e '/@config_sh@/{
	r config.sh
	d
	}'

Or: 'replace every line containing @config_sh@ in the input with the contents of the file config.sh'. Simple. All we need now is to generate that config.sh file:

config.sh: Makefile
	env -i FOO=@FOO@ BAR=@BAR@ > config.sh
%: %.shin config.sh
	$(srcdir)/makesh < $< > $@

The first rule generates our config.sh file; the second uses the sed script above (saved as 'makesh' in the source directory) to turn any '.shin' file into a processed script. And now, my shell scripts only need to say '@config_sh@' somewhere appropriate to get access to all my AC_SUBSTed variables. Whee.
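
To make that concrete, here is a minimal sketch of what such a script could look like; the file name and the echo lines are made-up placeholders, reusing the FOO and BAR from the example rules above:

#!/bin/sh
# hello.shin -- turned into 'hello' by the %: %.shin rule above.
# The line below gets replaced by the contents of config.sh, i.e. by
# plain FOO=... and BAR=... assignments holding the resolved values.
@config_sh@

echo "FOO is $FOO"
echo "BAR is $BAR"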

(Side note: I could probably run the sed command inside the Makefile itself, but apparently my sed- and make-fu is not strong enough -- whenever I try that, sed errors out one way or another. Oh well.)

Posted