Bobulate


Archive for the ‘Bla Bla’ Category

Back in time

Monday, July 26th, 2010

Back in time. That is, back (from vacation) in time (to avoid the long, drizzly, tedious soaking rains of today). I was off for a week, bicycling with the family in the Netherlands. We did not go camping this time, just to "trekkershutten", which I suppose can best be translated as "simple holiday home for a traveling family". The week was hot and sunny, which was just the right kind of weather for us. This year the kids had to ride their own bicycles — no more tow or kiddy seats! For both Mira and Amiel, 25 km per day is really the maximum they can handle. On 16" and 20" wheels that’s still a long way to go. Amiel just pedals along fanatically, at a steady 9.5 km/h. So while it’s a real distance and endurance achievement for a 5-year-old kid, mom and dad’s main challenge is cycling that slowly all day.

So today was rain all day, and I’m glad we didn’t have to ride through that. Catching up on a week’s worth of email (#1: I really need to install Kontact on my n900; #2: expensive hotels in Amsterdam charge you 7EUR an hour for internet, while cheap hostels in the middle of the forest offer wifi for free) and trying to ignore all but the most urgent for now.

A bit of Postfix and Darcs

Wednesday, July 14th, 2010

Somehow the official Postfix documentation continues to be intractable for me. The whole reason for getting Postfix up and running on my local machine was Darcs. Darcs — one of many distributed version control systems, but possibly the only one written in Haskell and intrusively interactive — likes to send patches by email. It just uses the local MTA for that, so I needed one. Setup is slightly complicated by the situation my home workstation is in: while I have a domain I’d like to use for email going out of my house, I’m on a DSL line with outgoing SMTP blocked except to the DSL provider, and my local usernames don’t match the email addresses I’ve created.

So, basically I want to send all locally submitted mail to my DSL provider’s SMTP server, with the right domain attached, without my hostname, and I want to re-map my usernames. I didn’t manage to distill this from the actual Postfix Standard Configuration Examples, but Ralf Hildebrandt’s configuration examples do have the right stuff. The only change I had to make was for the username rewriting; instead of virtual_maps (since superseded by virtual_alias_maps), I used canonical_maps = hash:/etc/postfix/canonical, which maps user and user@localhost and user@domain to email.address@domain (3 lines per user login).
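For the record, the relevant configuration boils down to something like the following (the relay host, login and addresses here are invented for illustration; the real values come from the DSL provider and my own domain):

# /etc/postfix/main.cf (excerpt)
# Send everything out via the provider's relay, attach my domain rather
# than the local hostname, and re-map local usernames via the canonical table.
# (Hostnames and addresses below are made up.)
relayhost = [smtp.provider.example]
myorigin = example.org
canonical_maps = hash:/etc/postfix/canonical

# /etc/postfix/canonical: three lines per login, as described above
ade                 adriaan@example.org
ade@localhost       adriaan@example.org
ade@example.org     adriaan@example.org

After editing the map it still needs a postmap /etc/postfix/canonical and a postfix reload to take effect.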

Now darcs send does what it should. I can also point KMail to use local sendmail and that gets the job done, too. KMail is actually simpler, because it sets up the outgoing From and envelope addresses based on the identity in use, so it doesn’t need the rewriting at all.

I’m starting to find Darcs a little bit interesting; the idea that every patch is a branch based on what it actually needs is intriguing. Or, to put it another way: if I have one patch to file A and one patch to file B, then I’ve created two branches; there’s no need to consider the two together. If I make another patch to B, then that branch grows longer. My working copy displays all the changes from the branches I’m working with, that is, the merge of all my branches. If I were to add a patch that changes both file A and B, then the two branches merge there. Darcs uses tags to slam a line across all the files and all the patches and to merge all the open branches, which prevents a gigantic proliferation of possible branches.

The upshot is that when I do ‘darcs send’ I have the option of sending any branches I have open — and any sensible sequence of patches within the branch — so that I can very carefully upstream patches while keeping them all visible locally.

What I really miss in Darcs is some of the Mercurial workflow: seeing a graphical representation of the current branching structure (hg glog) and being able to quickly merge multiple commits into a single one for upstreaming (hg qfold). On the other hand, the darcs behavior on commit (darcs record) of asking you about every changed hunk is really nice (sometimes, at least) because it helps avoid accidentally committing debug bits or whitespace changes that you didn’t intend to make.
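To make the darcs side of that concrete, a session looks roughly like this (the file and patch names are made up, and the exact interactive prompts vary by darcs version):

# file names below are just examples
darcs record foo.c            # asks about every hunk in foo.c: y to record, n to skip
darcs record bar.c            # a second, independent patch touching only bar.c
darcs changes --summary       # list the recorded patches and the files they touch
darcs send -o bar-fix.dpatch  # pick which patches to bundle, written to a file

Without -o, darcs send hands the bundle to the local MTA, which is what dragged Postfix into this in the first place.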

[[ ObKDE: and somewhere underlying this all is my need to get a Python application to talk to KWallet; why oh why didn’t I have enough lunches with Richard Dale in Finland? ]]

Home from Akademy

Sunday, July 11th, 2010

I snuck away from Akademy on Friday morning. My intention was to sign some legal documents (part of a resolution of the AGM of the KDE e.V.) and say goodbyes to all and sundry, but that got terribly sidetracked. The usual experience of walking into Demola is people saying “Hey, [ade], I need to talk to you.” I don’t imagine this is unique to me — there’s so much coordination that goes on at Akademy when you finally have every sub-project on hand to chat with. So I ended up with a long talk with Elias about truth in advertising, and then I tried to print and sign and scan the document at hand. Kaare, a guy I’d exchanged some banter with during the day trip, wandered over. It turned out that Kaare is the Skanlite dude, so I took the opportunity to thank him for his work.

Then rushed goodbyes — I skipped the whole of floor 4 with the BoFs — and off to the bus. Milian, Niels, Richard and Lubos were on the same flight, and everyone who flew onwards from Helsinki to Amsterdam had their luggage left behind. So it was 9pm before I got home, sans backpack. Like Harri said, it’s not so bad on the way back. My luggage was finally delivered at home at a quarter past eleven at night (darkness!) in a violent thunderstorm.

So, yeah. Akademy rocked. The Mexican has pictures that give a good impression of the atmosphere there.

Upcoming Conferences: Linux Kongress

Wednesday, July 7th, 2010

Linux-Kongress 2010, Tue, Sep 21 to Fri, Sep 24 at Georg Simon Ohm University, Nürnberg, Bayern, DE

[quote] Linux-Kongress is by far the most traditional Linux conferences with a focus on development and cutting edge topics. GUUG will organize in 2010 the 17th edition of this event, first started in 1994 in Heidelberg, Linux-Kongress made its trip through a variety of German cities, Netherlands and to the United Kingdom (LinuxConf Europe). Since its start 17 years ago Linux-Kongress has been evolved into the most important meeting for Linux experts and developers in Europe.

The conference focus is kernel and lower-level technologies, and things like desktops don’t show up in the programme, but there are a couple of interesting bits for the creators of user-level technology, I think: performance tools and IP stack conversions. The former would be interesting to apply to KDE bits in general; the latter will probably just need to be done once at the right level (Qt) to have everyone switched.

Completely unrelated: There’s also a bunch of fairly nice words at the Register about OpenSUSE 11.3.

Akademy D-2, in Tampere

Thursday, July 1st, 2010

Arrived in Tampere this afternoon, found Hotel Ville (if you take bus 61 from the Airport, there’s a stop in Hatanpaa right close by), heading out now to meet with the local team. I’ll also find out who is the boss this week: Claudia, Ilkka or Sanna. [[img src=CrudelyDrawnKolourpaintVersionOfIIsAtAkademy]]

Last minute preparations

Wednesday, June 30th, 2010

The old adage "if it ain’t broke, don’t fix it" applies, but nonetheless I felt the need to fix my laptop. It was cluttered with FSFE materials that I shouldn’t be carrying around, for one thing, and the Kubuntu 9.04 on it was decidedly long in the tooth. As prepwork for Akademy (o harbinger of doom!) I decided to clean it up: one Linux install, one OpenSolaris. This is an MSI 620GX laptop, which is a Centrino 2 based machine. Hardly exotic stuff.

For social reasons — as in, Sebas had recently written glowingly about it — I started off with OpenSUSE 11.2. It installs nicely (but with GPT by default, it seems?) and delivers a good-looking KDE4 desktop, plenty of apps. Compositing was enabled (GeForce 9600M GT). Setting up a devel environment was a mild challenge. For various projects I use svn, git, mercurial and darcs, so getting those is a first priority. Darcs was a little harder to get: there is a package available in one of the repositories, but I didn’t feel like setting a repository up for one package, so I downloaded and installed it manually.

It’s when I tried to suspend to RAM or to disk that issues started showing up. Suspend to RAM fails, saying that the machine is unknown and not whitelisted. s2ram -f puts the machine to sleep, but it doesn’t resume. Similarly, hibernate (suspend to disk) works but doesn’t resume. I still need to send in the info for that, but after half an hour of fiddling with it (and knowing that Kubuntu 9.04 could suspend and resume on the same laptop) I gave up. Since I’m not particularly attached to whatever Linux I’m using, it was time to try something else.

Kubuntu 10.04 is what I’m running on my desktop — which has ATI graphics — and I appreciate that it starts up really quickly, etc. Vaguely annoyed at the microblogging thing it puts on the desktop by default, but that’s terribly minor. Installing all the dev tools was easy on the desktop. On the laptop, though, I didn’t even get that far. The nouveau driver included on the install CD doesn’t like the video card — and so the installation process bails out to a text screen. Folks in #kubuntu were helpful and ready with some suggestions, like nomodeset and using the vesa driver (hung the machine on boot). Running X -configure from the text login hung the machine too.

Fedora 13 up next. No compositing with the nouveau that is included — that’s in the mesa-experimental package, it turns out. The devel environment is easy to get, with all the version control systems one install command away. Of course, the first thing I tried this time was suspend and hibernate: both flawless. External monitor — important for presentations at Akademy — pops up a dialog with simple configuration. There’s one third-party application that I use that requires 32-bit libraries. Getting those was straightforward after finding out the package names with "yum provides ‘*/libraryname’". I see that they’ve also customized Konversation to go to multiple useful channels, rather than just the distro channel.
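For reference, hunting down the 32-bit bits went roughly like this (the library and package names here are just examples, not necessarily the ones that application needs):

# libstdc++ is only an example; substitute the actual library name
yum provides '*/libstdc++.so.6'
# install the 32-bit (i686) build of that package alongside the 64-bit one
yum install libstdc++.i686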

So, it seems I’ll be presenting at Akademy from a Fedora-based laptop (Rex, Kevin, a beer is on me). All I need now is the latest Air-themed LaTeX templates and I’m good to go.

One thing I’m left wondering is why three different Linux distros, all relatively recent, behave so differently on a fairly conventional platform like this one. The technology is there; it was even there last year. Where do these regressions come from?

Free(BSD) Graphics

Wednesday, June 23rd, 2010

Sebas wrote about Free and fast graphics a while back, which has led me to try the same on FreeBSD. I have a similar setup, with two monitors attached to a Radeon 4350. I’ve written about the 4350 on FreeBSD before. The minor challenge this time around was to get compositing working with a dual-head setup and to get the dual-head setup to show up on startup. Perhaps there’s a KDE-ish way of doing this, but I ended up setting up a xorg.conf with the desired layout. Without an xorg.conf, I’d get nicely mirrored displays, which I could manipulate sensibly with xrandr, but I couldn’t properly place the one monitor beside the other — xrandr kept complaining that my maximum screen size was 1650×1650.
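For what it’s worth, the runtime command I was aiming for is something like the one below (DVI-0 and VGA-0 are assumed to be the output names the radeon driver reports); without a large enough Virtual screen configured, this is exactly the sort of call that runs into the 1650×1650 limit:

# output names (DVI-0, VGA-0) assumed from the radeon driver's naming
xrandr --output DVI-0 --mode 1680x1050 --primary \
       --output VGA-0 --mode 1280x1024 --right-of DVI-0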

So, manual configuration it is. I’m using X.Org X Server 1.7.5, Release Date: 2010-02-16, built on FreeBSD 8-STABLE, amd64, using the radeon driver v.6.13.0.

Thanks to the much improved auto-configuration in Xorg, the configuration file doesn’t have to be very long. The bits needed were a Device section, two Monitor sections, and a Screen section, like so:

Section "Device"
Driver "radeon"
Option "Monitor-DVI-0" "Monitor206"
Option "Monitor-VGA-0" "Monitor430"
EndSection

(I’ve left out the Identifier and BusID lines, among others — they’re not relevant; I tried the radeonhd driver, but that didn’t yield quick results.) The important thing here is explicitly tying the Monitor identifiers to the XRandR outputs — I have a Samsung 206S monitor connected to the DVI output, for instance. From here, we go to the two Monitor sections. I’ve set preferred modes on both for their native resolution. When I didn’t, I got them both at the lowest common resolution, 1280×1024, which was ugly as sin.

Section "Monitor"
Identifier "Monitor206"
Option "Primary" "true"
Option "PreferredMode" "1680x1050"
EndSection
Section "Monitor"
Identifier "Monitor430"
Option "RightOf" "Monitor206"
Option "PreferredMode" "1280x1024"
EndSection

You can see that the 206S is my preferred monitor — this sets it up as the main monitor, which means that the KDE startup thing shows up there and most notifications do as well. The other monitor — a six-year-old Iiyama E430 — is off to the right, and houses Konversation and Akregator and other non-essential attention-grabbers. The difference in color reproduction between the two is striking, so much so that I give the second monitor a very different background just so I notice the color difference less.

The last bit is setting up the Screen for X. Here I’ve left out Identifier, Device and DefaultDepth; the important bits seem to be the Monitor and the Display subsection, which define a primary monitor (again?) and the size of the desktop. In this case, the desktop size is the sum of the horizontal widths and the greater of the two heights: 1680 + 1280 = 2960 wide by 1050 high.

Section "Screen"
Monitor "Monitor206"
SubSection "Display"
Depth 24
Virtual 2960 1050
EndSubSection
EndSection

Unlike Sebas, I don’t seem to have OpenGL compositing in this setup. That could be because of the card, the software — Sebas points out that it’s a little finicky with versions of kernel and video drivers — or something else. Switching to XRender gets me something that’s good enough.
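The switch itself is done in System Settings (Desktop Effects, Advanced tab); if memory serves, that ends up in kwinrc (on this FreeBSD setup, under ~/.kde4/share/config) as something like the snippet below. The key names are from memory, so treat this as a sketch rather than gospel:

[Compositing]
# XRender instead of OpenGL compositing (key names from memory)
Backend=XRender
Enabled=true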

*blink*

Sunday, June 20th, 2010

Whoa, you blink and all of a sudden two weeks have gone by. This is probably related to ongoing health issues (I continue to be very susceptible to every cold germ out there) and to me focusing my attention on two things: the garden (weeding, planting out beets and sweet peas and weeding and watering and hoeing and picking the first strawberries of the season) and fixing up api.kde.org. The api site is interesting because there are so many different scripts involved, some historical, some broken; in addition, the mechanisms for generating the dox themselves have changed over the years, so it’s not entirely trivial to generate new 3.5 documentation now. I copied over those archives I had and then tried to get KDE trunk (what will be KDE SC 4.5) up and running again. In the process I discovered a zillion broken links, got distracted by plenty of bad documentation, cursed perl from here to infinity, designed a rewrite of the tools in python and then got distracted by the garden again.

And then it was Father’s Day, and I got a book titled “1001 languages”, which gives an overview of the world’s natural languages. Fun stuff, as it dives into grammatical and structural analysis of language groups as well.

Lest I blink again and not get around to blogging again before mid-July, I should add:
I'm going to Akademy
I’ll be giving a talk at Akademy about legal issues: where are we with the FLA, how do we see patents affecting KDE, what about copyright in general and where does KDE live in the landscape of Free Software and the Open Source world.

Remote DVD drives

Tuesday, June 8th, 2010

The EBN server — the real one, the box in a data centre with dual power supplies and lots of other jazz — is about to be re-fitted with OpenSUSE. That means swapping some drives around. At least, that assumes the machine actually supports running OpenSUSE. It’s a dual-cpu, dual-core, 8GB Sun X4200M2, which seems like a reasonable target to try, and what I’ll do is swap out the existing disks and swap in the disks from my home test system (which is a one-cpu, dual-core, 4GB system). However, I wanted to try out the new OS on the machine before actually bicycling all the way to the data centre at the uni.

Enter, once again, Sun’s ILOM.

Man, this stuff is great. The console redirector can not only show me the video output (from the 8MB ATI RageXL or whatever bit of cruft is in there) and give me a local console keyboard, but it can also redirect the DVD drive. I consider that pretty amazing. Consider: I have an ISO image of OpenSUSE at home. Right now the server, 9km away, is booting off of that ISO image — I don’t need to have a physical CD, or be near the machine at all. Sure, it’s sucking up all the bandwidth on my DSL to transfer blocks from the ISO image to the server, but it’s starting... and then panicking into a reboot some time after starting the Linux kernel, so let’s try that again in the morning 🙂

Hugin and patents

Tuesday, May 18th, 2010

It’s interesting to see Hugin show up twice in one day: Ken Wimer describes how he produced the UDS group photo with it — he focuses on the use of Hugin and the user interface — while Michael Kesper has made a panorama in Tromsø. But Michael points out an interesting issue with Hugin: a patent that may apply to the panoramic-stitching algorithm. Hugin prints a warning when using that tool, apparently.

This gives me an opportunity to talk about patents a little bit — a refreshing change from OpenSolaris packaging, even if I spend more time on the latter these days.

So, first off, the GPL (version 2, but version 3 has similar language) has an interesting clause about patents that many people forget about. It’s clause 8:

8. If the distribution and/or use of the Program is restricted in certain countries either by patents or by copyrighted interfaces, the original copyright holder who places the Program under this License may add an explicit geographical distribution limitation excluding those countries, so that distribution is permitted only in or among countries not thus excluded. In such case, this License incorporates the limitation as if written in the body of this License.

So supposing you knew in advance that a patent applied to an algorithm used in your piece of GPLv2-licensed Free Software, you could exclude those areas where the patent applies. In other words, you could have something like “Hugin is Free Software, but may not be used in the United States as a consequence of US patent #N”. A surprising idea, no? That’s one special case where Free Software isn’t necessarily Open Source software (the Open Source definition disallows restrictions by country).

Source code as such (certainly in the case of compiled languages) does not implement an invention — it merely describes an algorithm. As such, I believe that the source code of a Free Software implementation of some patented algorithm (as if it makes sense to patent an idea, but hey, some patent offices hand these things out) cannot infringe the patent even if you accept the validity of algorithmic patents because it’s just describing something that has already been published — the patent text itself! On the other hand, compiled forms of the same do implement the algorithm in a machine and might be covered. I’m not sure if anyone has really dug into the implications of the division between source and object code in this area.

In cases like this, the Open Invention Network might be of use. It’s a patent pool organization for Linux. Since Hugin isn’t part of Linux (as in, the kernel bits) it’s unlikely to be helped out directly. The OIN folks are some of the most pragmatic and sensible people I’ve talked to about the effect (negative) software patents have on us all.

A brief patent search (ah, for people who think this might disqualify me forever from participating in development, I’ll point to Andrew Tridgell’s talk on reading patents — also on OSNews) didn’t turn up anything filed by the University of British Columbia that seems to apply. However, it might have been way too brief a search, as there’s darn little to go on based on the warning message from Hugin. To do it right the warning would have to be far clearer, possibly pointing to the actual patent number. Otherwise, calling the UBC patent licensing office (actually, you need to email Greg Lowe directly) for information is a little difficult.

One article on SIFT (Scale-Invariant Feature Transform) panorama stitching can be found at Springer, although the research page for SIFT from David Lowe is more interesting. It at least lists the exact patent number and stuff like that. Frankly I’d rather read the research papers than the patent text — at least the papers have sharing knowledge with the research community as their goal, as opposed to obfuscating the invention to make it broadly applicable in courts of law. However, one could apply the mechanisms Andrew Tridgell describes to the patent, and develop a stitching algorithm not covered by the patent simply by not doing something from the method described there. For instance (I’m not an image processing guy here), finding difference images in a different way.

Kudos to the patent writer, anyway, for claim 20 “A computer readable medium comprising codes for directing a processor circuit to execute the method of claim 1”, so that the patent covers the method, apparatus for implementing the method and computers doing the same. That’s surely a convolution caused by the way the patent system works (again, Tridgell: if the patent didn’t cover computers explicitly, then you could argue that implementing the method with a computer is not what is claimed, ergo you’re free).

Ugh. Too much time spent trying to understand the whole “this is my idea and you can’t have it” (as opposed to expressions or performances of an idea) culture.

While looking for the SIFT patent, I did find US patent numbers 7,639,897 and 7,711,262 which both cover guiding a user of a digital camera in making a panorama photo. They seem awfully similar to me, although obviously there’s a giant difference (sarcasm doesn’t work in writing unless Penny Arcade does it) between sweeping a scene and then re-photographing it and indicating already-photographed areas as the scene is swept. I guess there’s no patent yet on not helping at all.