Debcamp is nice. Being able to talk to people in person rather than over IRC to get anything done is a breeze, really.

One thing I haven't been particularly happy about is the fact that there aren't any good statistics on what the build capacity of an architecture is; i.e., you don't know you're low on build capacity until you suddenly start backlogging, and by then it's too late. So, since Jörg Jaspert, Mark Hymers, and Steve Gran were there, I asked them whether it'd be possible to add some data to the projectb database about when a binary is installed. One additional column with a default of now() later, and we now have interesting data that I can make statistics from. Of course, since that column was only added four days ago, during one of which the link was down (meaning: no uploads), there's not much data there yet; but I can start making some graphs now.
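The idea above boils down to a single timestamp column that the database fills in by itself on every insert. The real change was an ALTER TABLE on the (PostgreSQL) projectb database; the following is just a minimal sketch of the same mechanism using SQLite, and the table and column names are made up for illustration.

```python
import sqlite3

# Sketch of the projectb change: a timestamp column with a default,
# so every binary installation is stamped automatically. Table and
# column names here are hypothetical, not the real projectb schema.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE binaries (
        package   TEXT NOT NULL,
        arch      TEXT NOT NULL,
        -- equivalent of PostgreSQL's "DEFAULT now()":
        installed TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    )
""")

# The install code doesn't need to mention the column at all;
# the timestamp comes for free.
conn.execute("INSERT INTO binaries (package, arch) VALUES ('hello', 'i386')")
ts, = conn.execute("SELECT installed FROM binaries").fetchone()
print(ts)
```

Because the default lives in the schema, none of the existing code paths that insert rows had to change, which is why it was a one-column, one-afternoon modification.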

Of course, the hard part is figuring out how to present the data so that one can actually draw useful conclusions from it, which is harder than it seems. Anyhow, I'm trying. For now, on my public space on merkel, you can find a (mostly empty) per-architecture scatter graph relating the size of a binary package to the time between the dinstall of a source package and that of its binary for that particular architecture. If the time between a source upload and its binary upload is "often" more than a day for small packages, then it's obvious that the architecture in question is having problems keeping up.
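That "small packages, long delay" heuristic can be sketched in a few lines. The records below are entirely made-up sample data standing in for what would come out of projectb (source dinstall time, binary dinstall time, binary size); the 1 MB "small" cutoff and the more-than-half "often" rule are arbitrary assumptions, not anything dinstall actually uses.

```python
from datetime import datetime, timedelta

# Hypothetical (source_dinstall, binary_dinstall, size_in_bytes)
# records for one architecture; real data would come from projectb.
records = [
    (datetime(2005, 7, 10, 2, 0), datetime(2005, 7, 10, 9, 0), 40_000),
    (datetime(2005, 7, 10, 2, 0), datetime(2005, 7, 12, 2, 0), 35_000),
    (datetime(2005, 7, 11, 2, 0), datetime(2005, 7, 13, 14, 0), 50_000),
]

SMALL = 1_000_000            # arbitrary "small package" cutoff: 1 MB
LAG_LIMIT = timedelta(days=1)

# Delays for small packages whose binary lagged the source by > 1 day.
small_lagging = [
    bin_t - src_t
    for src_t, bin_t, size in records
    if size < SMALL and (bin_t - src_t) > LAG_LIMIT
]

# "Often" interpreted here as: more than half of the sampled packages.
backlogged = len(small_lagging) > len(records) / 2
print(backlogged)  # True for this made-up sample
```

On the scatter graph the same signal shows up visually: a cloud of points sitting above the one-day line at the small end of the size axis.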

It's not very useful (yet), but the graphs will be updated on a daily basis, so that hopefully one or two months from now, they will contain useful data. Note that the scales are fixed to go from 0.1 day to 1000 days, and from 1k to 512M in package size, so as to make sure one can actually compare graphs.

I'm also trying to come up with a useful line-based graph, so that it is possible to compare architectures over time against each other. This will probably involve something with averages and standard deviations or some such, I guess. Not sure yet.
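One possible shape for that line graph: collapse each architecture's delays over a period into a mean plus a standard deviation, giving one point (with an error band) per architecture per period. A toy sketch with invented per-period delays, nothing measured:

```python
from statistics import mean, stdev

# Hypothetical build delays (in days) over one period, per architecture;
# the architectures and numbers are made up for illustration.
delays = {
    "i386": [0.2, 0.3, 0.1, 0.4],
    "m68k": [1.5, 3.0, 2.5, 4.0],
}

# One (mean, stdev) pair per architecture; plotted over successive
# periods, the means become comparable lines and the stdevs a band.
summary = {
    arch: (round(mean(xs), 2), round(stdev(xs), 2))
    for arch, xs in delays.items()
}
print(summary)
```

A mean alone would hide the difference between "consistently a bit slow" and "usually fine, occasionally days behind", which is why the standard deviation (or some other spread measure) probably belongs on the graph too.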