Bobulate


Archive for the ‘KDE’ Category

BugSquad-ish

Friday, July 10th, 2009

[[ Some notes from behaving like a BugSquad member. Picture of the real deal on the dot. ]]

There was a BoF session on a timezone bug, where Pau Garcia y Quiles brought together folks running all kinds of Linux distros and other Free Software operating systems to examine how each of them sets the system timezone. That yielded lots of new information, but not yet a resolution. Changing the system timezone on Solaris seems to require a reboot – that doesn’t seem right. I’ll have to play with the GNOME code to see what it does.
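
As a reference point while digging, here is a minimal sketch of mine (not from the BoF; the zone name is just an example) showing that, at the process level, a timezone change can be picked up at runtime through the POSIX tzset() call without any reboot. How a system-wide change gets propagated to already-running daemons is of course the harder question.

#include <stdio.h>
#include <stdlib.h>   // setenv()
#include <time.h>     // time(), localtime(), strftime(), tzset()

static void print_local_time(const char *label)
{
    char buf[64];
    time_t now = time(NULL);
    strftime(buf, sizeof(buf), "%Y-%m-%d %H:%M %Z", localtime(&now));
    printf("%s: %s\n", label, buf);
}

int main(void)
{
    print_local_time("before");
    setenv("TZ", "Europe/Amsterdam", 1);  // pretend the timezone just changed
    tzset();                              // re-read the timezone settings
    print_local_time("after");
    return 0;
}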

Re-tagged a bunch of old bugs that were not FreeBSD-specific to the “All” operating system, after checking that they still exist in the current 4.3-rc1 on Solaris.

On Monday evening, I wrote a BugSquad song. I guess those are the only 12 lines of original content I’ve written so far?

C#, see submarine

Tuesday, June 30th, 2009

There have been two posts about C# and mono on PlanetKDE this week (e.g. Richard and Andreas). The comments on Andreas’ entry are quite cogent, as are those replying to Richard, but the discussion deserves a wider audience. As for asking RMS at Gran Canaria this weekend: it’s worth a shot if you abstract the question away from C# and mono specifically.

At issue is the notion of a Free Platform. Or perhaps an unencumbered platform. The latter is a weaker term because it does not stress the Freedom aspect of the software we write, but it might better express the uncertainty around what constitutes such a platform. There are many things that might encumber a software program or software platform. Dictionary check:

encumber 1. weight down, burden 2. to impede or hamper the function or activity of 3. to burden with a legal claim (as a mortgage)

Here’s a non-exhaustive list:

  1. Technical inadequacies (functional)
  2. Technical inadequacies (maintenance)
  3. Poor documentation
  4. Lousy management structure
  5. Licensing troubles
  6. Copyright management trouble
  7. Patents

To phrase the first four even more informally, that’s “it doesn’t work well”, “the code inside it sucks”, “I can’t figure out how to use it”, and “the people working on it are jerks”. They are technical or organizational matters, well outside the scope of what the FSF (and sister organizations FSFE, FSFI and FSFLA) are normally involved in. Comparisons between C# and other languages on the first three points (Objective-C has a defrobnicator and C# doesn’t, nyah nyah nyah) miss the point. Project management is, I think, largely a personal choice, and unless there are other indicators, I don’t think you can say “community management is better than having a single entity”. So contrasting Java’s Sun^WOracle^WJCP management process with whatever C# / mono uses misses the point again, except where it touches on those other indicators.

But then we get to licensing, copyright consolidation and patents, and these are the relevant indicators for determining whether a platform or software program is encumbered (vs. Free). For licensing, the question to ask is “is this a Free Software license?”. If you’re cheeky, you could check the Open Source definition as well. Mono is under a mix of GPLv2, LGPLv2.0 (not 2.1!) and MIT/X11, so the code of Mono itself and its derived works is Free Software (by this measure) and works linked to the libraries are not encumbered.

Copyright management is one of those topics close to project management style, but I believe that a project with a clearly stated copyright management style — almost regardless of what that style is — is better off than one with a confused, ambiguous or stupid one (an example of the latter being: publish all the source with no attribution and no copyright headers). In any case, the Mono project, with its mandatory copyright assignment (to Novell), is clear.

Which is when we come to the last item on the list, patents. Software patents per se do not exist in the EU, but they do in the US; I am insufficiently aware of the situation in other parts of the world. There is the notion of the “submarine patent” (a term whose use has mutated over time, see this article for instance about long-pending patents). In current use, it seems to mean a patent that is not well known but applies to some piece of technology that is incorporated into a standard (either an official one through a standards organization or a de facto standard). That’s odd from a patent perspective, because the whole point of a patent (dict.: shown, open to public inspection, well-known) is to make sure that knowledge is pushed into the public domain at some point (that’s the social contract around patents). In any case, Aaron phrases it quite well:

there is a rather higher than zero chance of Microsoft taking advantage of its patents and coming down on C# implementations when and if it feels like it. they have an agreement with Novell, and Novell thinks it covers everyone but Microsoft seems to disagree. and that’s their public position.
however, we don’t know for sure. so it’s “only” a risk, not an absolute.

You will find similar statements about the risk entailed in using the possibly-patent-encumbered Mono .net framework in the comments to Andreas’ blog entry (the first comment, by STiAX, in particular).

So the whole issue isn’t about licensing, management or technical features, but purely about the risk involved with using a platform that is encumbered by patents. This is an issue on which the Free Software developer community has been either split or ambivalent for years. GIFs? MP3s? .NET apps? All encumbered at some point or in some way.

A similar kind of encumbrance would arise if MIT (or Xorg) could retroactively re-license the X11 libraries to something proprietary (note: they cannot), thereby removing the platform upon which all Free Software X11 applications are built; it would be a risk, and given the importance of Free Software, one whose cost, should it ever materialize, would be huge.

This isn’t to say there are no other submarines in the water. We don’t know; maybe we should. The known submarine should be treated with caution, and erring on the side of caution means treating C# as a non-Free platform, to be avoided.

Postscriptum and prescription of the FLA

Monday, June 29th, 2009

The Fiduciary License Agreement (FLA) between KDE contributors and KDE e.V. assigns those rights derived from authorship that can be assigned from the original author to the fiduciary (i.e. KDE e.V.), and then grants back to the original author, non-exclusively, the rights to (0) use, (1) study, (2) modify and (3) distribute the work, and (4) authorize third parties to do the same.

Ugh, that’s a lot of legalese, but that is also why my slogan for the KDE project is “I talk to lawyers so you don’t have to.” It’s a wonderful thing that Sebas has been talking to KDE contributors at LinuxTag and has obtained a number of signed FLA documents. That means that a good chunk of important KDE code is now actually owned — in the sense of copyright — by KDE e.V. Sebas quotes our friend Carlo Piana (he received the pineapple fortune cookie award for Coolest Lawyer, once) describing an FLA as follows:

The fiduciary licence aims at simplifying this process, by assigning the copyright to an entity as KDE e.V. which is not “scalable” and therefore provides sufficient safeguards as to the possible hijacking of the project for nefarious reasons.

Now, I’m not entirely sure about that “scalable” there. KDE e.V. is scalable, in the sense that with individual donations and supporting members we have the resources to support the growing developer and contributor communities as well as serve users in general through efforts like UserBase. I think what Carlo meant is “saleable”, in the sense of “you can’t buy a community.” I’ll have to ask him, next time we meet.

So you can’t buy a community and you can’t buy a non-profit association with strong checks and balances in its constitution. This is good, and having a strong copyleft Free Software license applied to the software as well ensures its long-term availability (don’t let that link fool you, though: the KDE platform libraries are LGPL licensed, so you can, if you really feel it is necessary, write proprietary applications on top of it — but consult your local counsel for license advice). The main issue that the FLA tries to solve is really one of license and responsibility fragmentation.

When multiple authors work on something, each author holds a share of the copyright on the creative work — at least, each author who contributes something original and creative enough to be considered a creative work in its own right. The result is fragmented ownership: all the authors of a particular work have to agree on copyright matters. This fragmentation can consume considerable resources if there is ever a need to deal with all the rightsholders of one particular work.

Note that KDE contributors — all of us — have traditionally been rather lax in maintaining the copyright headers in our sources. We do not maintain a comprehensive list of authors in each file, nor do we follow GPLv3 article 5.a very well, in general. Figuring out exactly when 5.a applies is something I’ll leave for the real lawyers and another blog post. In any case, a consequence of the signing of FLAs by a number of authors is that for their work the copyright header should be changed to

Copyright [years] KDE e.V. <kde-licensing@kde.org>

Ideally we would include a postal address (of KDE e.V.) as well; the whole point of this exercise is to make it really darn clear whom to contact for licensing information, and to make sure that we clearly claim the copyright on these files.

Note also that the KDE licensing policy is lacking in some details and allows poor licensing hygiene by potentially mixing incompatible licenses; we have had license checks to catch this (Tom Albers has been instrumental, now and in the past, in moving that forward). Just because we’re not doing things optimally now doesn’t mean we can’t move forward and improve things (this applies in many fields of endeavour).

The FLA used by KDE e.V. has a big blank where you can fill in which works are covered by the FLA. There is also a pre-filled form (PDF, 50kB) which identifies the works using standard language referring to your SVN account name. That should make filling things in easier. If you didn’t sign up at LinuxTag, you can print that, fill it in, and mail the form to the KDE e.V. office. We maintain a list of signed FLAs as well, to keep track of who has done so — let me emphasize that signing an FLA is optional and the choice to do so rests entirely with the individual whose creative work is covered (or would be covered) by such an FLA.

So, by concentrating the copyrights held we reduce fragmentation; given that we have a strong basis to build on, with careful checks and balances in the constitution of KDE e.V., this is an improvement for the currently-hypothetical case that we would want to (or have to) relicense large parts of KDE to some other Free Software licence.

There are additional checks placed on any relicensing attempt on the part of KDE e.V. They were added as a sort of backup guarantee that KDE e.V. cannot do evil in relicensing code. At the same time, these relicensing restrictions (written down in the Fiduciary Relicensing Policy) reduce the effectiveness of the FLA itself, because the FRP says that we at least have to try to get permission from the original author before relicensing. On the other hand, it does mean that we get to judge “reasonable effort” ourselves instead of letting someone else do it. So in the end we (as in KDE e.V., representing the KDE community as a whole) do have a stronger grasp of the code, in order to be able to defend it if needed.

And, since the rights are transferred back in a non-exclusive license to the original author, the original author may fork or relicense if that’s really absolutely necessary. I should point out that that should be a real last resort and that working with the rightsholder (i.e. KDE e.V.) should be preferred. Remember, KDE e.V. exists to support (“we exist only to serve”) the development of KDE software, including the KDE workspace, KDE platform, and KDE applications. If there is some perceived need to fork, then somewhere there’s a misunderstanding of what the constitutional aims of the association are.

But I digress. There is an FLA, and it is signed by many people. Perhaps many more will do so at Akademy this year.

So where do we go from here? Maybe next weekend, we can take over the world.

The answer to this question actually depends on which hat I’m wearing. The KDE hat says: continue to consolidate licensing, pursue license checking and accuracy across the entire codebase and behave as an exemplary community software project with regards to legal matters.

My FSFE hat says that we need to take the concrete experience with KDE and with Bacula and introduce other projects in Europe and the rest of the world to this kind of lightweight legal housekeeping. The FLA has been translated into many languages, but I feel that, having used it in KDE, it could use a little extra precision. Also, any legal document intended for use by non-lawyers probably needs an implementation guide and HOWTO. And most importantly, those need to be well-known to the projects that might need such documentation.

Template function parameters

Monday, June 15th, 2009

Started pushing patches produced in packaging KDE 4.3-beta2 on OpenSolaris to the KDE SVN repo yesterday. Started on some safe targets, like KPilot, where I know the codebase pretty well. One of my patches was “add a newline at end of file”, which is one of those kind-of-dorky fixes that sometimes needs to happen.

There was one interesting patch, though, which again illustrates how gcc stretches (or violates) the C++ standard for convenience. The issue is templates that take function pointers as parameters, like this one:
template <int (*f)(int)> class A { } ;
This is a template you can instantiate with a single function that takes an int and returns an int. For instance, close() would fit the bill. Or would it? In C++, functions also have linkage — either C++ linkage (the default) or extern “C” linkage. In the example above, the parameter f is expected to be a function with C++ linkage. This means that you can’t use close() as a parameter to this template, because close() has C linkage.

With gcc you can. It glosses over this difference, but standard-conforming compilers will refuse to compile this. It’s apparently a confusing and contentious issue, given the number of blog entries (like this one) and forum posts dedicated to it. If it had been just KPilot, I suppose I would have just committed stuff and moved on, but template function parameters show up in Akonadi as well, so I suspect they will get more use as time goes on.

The point is you can’t write
template <extern "C" int (*f)(int)> class A { } ;
to specify the linkage of template parameter f. There are apparently two ways of working around this. One is to use a typedef to introduce the linkage into the function type, like so
extern "C" typedef int (*cfunc)(int);
template <cfunc foo> class A { } ;

The other is to introduce an intermediate function with C++ linkage (as expected by the original template) that calls the intended target function with C linkage, like so:
int _close(int fd) { return close(fd); }
// ..
A<_close> a;

Here we introduce an intermediate function _close that just calls the original function; the only difference is the linkage of the two. Making _close static inline should keep the overall impact low (I haven’t investigated).

Which approach you choose depends on how much control you have over the template definition. For templates defined by an external library, only the latter might be possible.
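
For completeness, here is a small self-contained sketch of both workarounds side by side; the names A_cxx, A_c and close_wrapper are mine, purely for illustration, and only the fragments above come from the actual patches. A strictly conforming compiler should accept everything except the commented-out line.

#include <unistd.h>  // close(), declared with extern "C" linkage

// The original situation: the parameter type int(*)(int) has C++ linkage.
template <int (*f)(int)> class A_cxx { };

// Workaround 1: a typedef that carries C linkage into the function type.
extern "C" typedef int (*cfunc)(int);
template <cfunc f> class A_c { };

// Workaround 2: a C++-linkage wrapper around the C-linkage function.
// It keeps external linkage here, because in C++03 a function with internal
// linkage (e.g. declared static) is not a valid non-type template argument.
inline int close_wrapper(int fd) { return close(fd); }

int main()
{
    // A_cxx<close> a0;        // ill-formed for a strict compiler; gcc lets it through
    A_c<close> a1;             // OK: the parameter type now has C linkage, like close()
    A_cxx<close_wrapper> a2;   // OK: the wrapper has C++ linkage
    (void)a1; (void)a2;
    return 0;
}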

What’s the worst that could happen?

Sunday, June 14th, 2009

I wrote my previous bit on KDE4 OpenSolaris packages before all of them were done; I scheduled the post and left the packages building and went off to teach classes (networking and SSL technologies for high school kids). What’s the worst that could happen?

Well, for one thing a dependency on a non-released version of Soprano could show up — one that isn’t tested for in the cmake code either. I can understand this to some extent; after all, there is a great deal of co-development going on between Soprano and its clients. Fortunately, Soprano has since been tarballed and added to the KDE source download mirrors. I see on the KDE release management mailing list that there are a couple more gotchas, but those will only show up once I get past KDEbase and PIM.

The life-size stopper for KDE PIM on OpenSolaris right now is the behavior of nepomuk-rcgen, which just segfaults on startup. That effectively prevents KDE PIM from building, since it uses the latest nepomuk cmake magic. On the other hand, things like konsole do work.

One commenter was asking about SourceJuicer and p5i files. The p5i files are a new thing in pkg since OSOL 2009.6 — I don’t know if pkgtool sets them up correctly, who is responsible for creating them, or where the publisher information is supposed to come from anyway. Any hints on what should go in the specfiles to produce useful p5i files would be appreciated. As for jucr, these packages aren’t yet in any shape to push there — and there are various problems related to build-time dependencies as yet unresolved. For instance, you can’t depend on cmake as a build tool on the jucr build machines. Some of the KDE4 dependencies have been pushed there, but they are not building regularly. It’s easier to do this outside of jucr, where updates can be done faster (and of course, the spec file repo is on bionicmutton if you want to build everything yourself, as documented on techbase).

PS. Konqueror + certain proprietary plugins needed for popular culture (viz. YouTube) works, too. I’m impressed (by the Konqueror developers).

PPS. You can now install KDEgdm-integration normally with “pkg install”. Note that the fonts for Qt applications are butt-ugly; run Qt config and set up nicer ones, then run systemsettings to fix up KDE’s appearance. At some point I hope to figure out how to set better defaults for both on package install.

Itty-bitty note on IPS packages

Friday, June 12th, 2009

The KDE4 IPS packages that I’ve been working on have been rolled out at bionicmutton, so if you are a brave soul you can install them by adding that host as a package publisher (pkg.bionicmutton.org points to the same). Something like:
pfexec pkg set-authority -O http://solaris.bionicmutton.org:10000/ kdeips
pfexec pkg install KDEgdm-integration

This will get you an /opt/foss and a /opt/kde-4.2 (the latter actually contains KDE 4.3-beta2). And you should be able to run bits and pieces from there, or even log in to KDE from the regular OpenSolaris login manager.

Getting the packages out there did have some issues, all of them networking-related: I build them at home on one of my OSOL machines, then push them out to the IPS package server, which is running on a FreeBSD machine at the university. Since my DSL upload speed is low (about 50% faster than Jon’s), uploads of large files time out regularly. Pushing Qt or Boost is pretty much a no-go, as it will fall over after, say, 90MB or more. I suppose I could get on my bicycle and carry my laptop to a place where I could use the direct wired network to the server, but that would mean going outside.

Instead, I found — relatively well-hidden, which I find typical of Sun documentation — a wiki page with a nice-to-read HOWTO, which in turn points to a download page where you can get the tools. The universal toolkit image is sufficient, and it contains pkg-toolkit/pkg/bin/archivepkgs.py, which will extract a package from a local repository and dump it to a tarball. In essence it defines an on-disk format for moving packages around: you can move the tarball to another IPS repo, unpack it, and serve it from there.

This means that I can build, publish to a repository on localhost (no network timeouts!), dump to a tarball, scp to the actual server (scp is a lot more robust in that sense than python scripts doing web-services over slow links), unpack, and there it is. It’s still a 40 minute upload of Qt, but knowing that it will succeed is important.

Thanks to Kohsuke Kawaguchi for writing the original tarpkgs.py script and documenting it on his blog.

KDE Forum – Congratulations

Thursday, June 11th, 2009

Some months back, the KDE forums were re-launched under a community banner (they had previously been run by a third party), with a new and enthusiastic admin team. One of the first things the team achieved was the release of the forum software under a Free Software license (it was previously proprietary). And they haven’t sat still since.

One of the signs of the professionalism of the forum admin team (professionalism in the sense of upright behavior, reporting, being responsive and friendly, and having a clear sense of purpose even in a volunteer organization) is the forum staff blog, which posts staff goings-on or highlights particular topics. It’s missing the notice that one of the team needs to leave (for personal reasons), so I’d like to take this opportunity to thank Rob L. in particular for his work in re-invigorating one of the communication channels KDE users have amongst themselves and with developers.

Personally I’ve never liked forums. I remember using the phrase “loathe and abhor” in some discussions around running a KDE forum at all, but that doesn’t take away the fact that many people do like using forum software for discussions (instead of mailing lists or usenet). But one of the things the team has done really well is to make the forum useful even to grumpy old farts such as myself, by managing the forum and reporting on the results of forum discussions — that’s professionalism again.

A cool feature that results from this is the KDE brainstorm (announcement on the dot). Here’s a place to hammer away at ideas and mold them into something sufficiently concrete that it makes sense to push into KDE’s bug tracker as a wishlist item. Bug trackers are not a good place for conversation, after all. By consolidating “discussion for new features” in that place, we can add structure — and avoid having the same discussion elsewhere, like in response to release announcements (all of mark’s wishes on that page make sense in the brainstorm or as specific wishlist items in the bug tracker, but they’ll get lost as comments to a news item).

Best of all, there’s a monthly summary which adds review, statistics, triaging and some meta-comments to the results of a month’s brainstorm: this is a really good way to “listen to the people” (for developers) without getting bogged down in discussions. In a sense we are reaching (for) the point where we have a sales dude (e.g. Sayak) who is canvassing the customers for features and returns to the product development team and says “we should do this, because people are asking for it.” Then we follow the still traditional Open Source development model to see if it actually gets done, but the difference is in the channels used to get ideas.

So congratulations, forum team, on building up something vibrant and useful.

[[ One last thing, though: good ideas are in far more plentiful supply than time and developers and artists to implement them; twenty ten-minute jobs will, in spite of the naive math, chew up a whole week of developer time. There comes a point when even the best of ideas will get stalled for “no time” — so roll up your sleeves and exercise your Free Software rights (for instance as granted by section 2 of the GPLv2). ]]

IPS on FreeBSD

Thursday, June 11th, 2009

Right. So here’s a foolish plan: I want to publish some OpenSolaris packages to the world. They are not ready for SourceJuicer (submissions there are slightly complicated, and the forest of KDE packages I’m producing is tedious and time-consuming to import there). Last time I tried that publishing thing, I did it on a package server on an OSOL machine at the end of my own DSL line, which pretty much immediately spanked my bandwidth completely. So I want to do it on a machine with better connections.

It may surprise some that all the real servers I have run FreeBSD, while I use OSOL as my main desktop operating system. There must be historical reasons for this. In any case, that means building the OpenSolaris IPS package server, pkg.depotd, on FreeBSD. It turns out to be relatively straightforward, as most of the actual server is written in Python (yay!) and there are only a few bits of C in there. So my approach was:

  • Fetch the sources for pkg as documented on the pkg project page (thank you Alan Coopersmith),
  • Install necessary development tools on my FreeBSD machine:
    portinstall py-openssl py-cherrypy intltool gnome-doc
    portinstall py25-mako py25-simplejson
  • Run gmake. At this point I realised that the makefile has some bugs leading to No rule to make target `help/C/package-manager.xml' — there is some semi-complicated implicit target stuff going on in there. I built the needed files by hand with
    cd gui
    msgfmt -o help/C/C.mo help/C/C.po
    xml2po -t help/C/C.mo -o help/C/package-manager.xml help/C/package-manager.xml.in

    and stripped out the locales that I didn’t need.
  • Build extract_hostid by hand with
    cd util/misc
    cc extract_hostid.c -o extract_hostid -L/usr/local/lib -R/usr/local/lib -lintl

    This is needed because gettext lives in /usr/local on FreeBSD and the Makefile (obviously) assumes the OSOL setup.
  • Fix the top-level makefile to use python from /usr/local/bin and run gmake again.
  • Patch setup.py to support FreeBSD — there appears to be support for sunos, linux, windows and darwin already (!?). This takes a patch at lines 102 and 194. Run gmake again.
  • Run gmake install
  • Finally,
    cd gate/proto/root_freebsd_amd64
    export PYTHONPATH=`pwd`/usr/lib/python2.4/vendor-packages:
    sh usr/lib/pkg.depotd -d ~/pkg/ips -p 10000

And there you have it. FreeBSD machines serving up OpenSolaris packages. For the purposes of KDE4 on Solaris, this means that pkg.bionicmutton will soon be returning and you will be able to install IPS packages instead of having to compile everything from source yourself. YMMV, contains rabid weasels, etc.

[[ And, as an addendum, let me congratulate the pkg / IPS team on the enormous improvement in the packaging system between OSOL 2008.11 and 2009.6; it used to be “ugh” and has reached, for me at least, a level of “hey, that’s pretty neat.” ]]

Debugging hardware

Tuesday, June 9th, 2009

I mentioned that I had one machine that wouldn’t run OSOL 2009.6, so I did a little hardware debugging on it. Step 1 is, of course, removing all the bits that aren’t essential. So I stripped the machine down to two sticks of RAM on the motherboard and a CD drive, and lo! It booted and behaved normally. Step 2 was adding back RAM until it was full again (at 6GB). The reason to strip RAM at all is that I’ve had bad sticks in the past and some OSsen are picky, so I figured stripping down was possibly useful. And, having repopulated the RAM banks, things were still OK. So it must have been one of the two extra NICs — an RTL8100 and an RTL8169 — in the PCI slots. Left them out for now, since the on-board NIC was now supported. Chalk up another problem worked-around and not really fixed.

I have a stupidly early train to Germany tomorrow, so just a few short notes on KDE 4.3-beta on OSOL: all graphicsview things seem broken, and using the raster renderer doesn’t help. Icons are not found — at least not the Oxygen ones that I thought would come with kdebase-runtime, though I can use the Tango ones installed on the system. Plasma crashes. Even konsole crashes a while after plasma and kwin do. KMail is busy pulling in 500MB of disconnected IMAP; nothing bad has happened there yet.

While pstack(1) would have been an obvious tool to apply to any of the crashes, that’s something to look at some other day. For now, it’s enough to have finished round one of compiling. [[ At GCDS there will be lots more time for polishing, for sure. ]]

Some notes on OpenSolaris 2009.6

Monday, June 8th, 2009

As I mentioned previously, I’ve updated some of my machines to OSOL 2009.6. Only some, though. It refuses to start on my AMD 760G based machine — it gets through grub, shows the splash and then hangs. I haven’t sat down to debug that one. It does run quite nicely on my new laptop (some folks asked: it’s an MSI GX620, which is nominally a “gamer’s laptop”. Poor battery life, but I realised that I don’t use the lappy on planes, and on the longer train trips I take there’s power. GF9600, P8600, 4GB, 320GB — it’s slightly more powerful than my desktop machines. The keyboard is OK; it takes a little getting used to because some of the punctuation keys are slightly smaller than usual. It has a numeric keypad, which as far as I’m concerned could have been left out in favour of some bigger keys. The machine is a little noisier than I might have wanted, too.) There’s also a really nice VirtualBox image for OSOL 2009.6. That one only gets you a text login, though.

On the KDE4 front on OSOL, we hit some issues similar to what happens on Windows (and to a lesser degree on FreeBSD). I’ll quote Christian Ehrlicher from his recent “stopping Windows development” blog entry:

Another problem I’ve is that I could not fix bugs in kdelibs just because the dependencies are moving fast and since we have to take care for all system libs (png, xml, openssl, pcre, …). Making sure that they’re up-to-date can take a lot of time. And when I finally managed to compile a KDE program I hit compiler errors. This is all fixable but not when you’ve only a little amount of time for KDE development.

Pavel has been hacking up a veritable storm in updating our KDE builds to 4.3-beta, and I’ve been following along trying to update dependencies (gpg and friends, lzo, akonadi, …) as well. And then we hit new dependencies (what is openslp?) all of a sudden, which means backtracking and packaging some other bit of Free Software first.

All this hacking happens in the publicly writable kde4-specs-42 Mercurial repository on solaris.bionicmutton.org. It’s publicly writable so that anyone can contribute, but that also means that it sometimes gets screwed up. By me, for instance, because I pushed something last week that broke all 64-bit builds in it. Gah. Anyway, that’s cleaned up now, and the current status is: KDE 4.3-beta builds, runs ok (oh joy of konsole compared to GNOME-terminal). In VirtualBox there are rendering issues. Those will go away in time, I imagine, or once someone fiddles around with default themes and such.

My intention is to release an OSOL 2009.6 KDE 4.3-beta VirtualBox appliance as soon as I have something that works “well enough.” That means that you need to be able to log in from a display manager (gdm or kdm, I don’t care — I haven’t gotten either of them to start up properly on the appliance yet) and get at least the KDE Plasma Desktop functionality, with Konqueror and Konsole as applications. That would make me happy, as a new milestone in the ever-shifting race to keep up with KDE development.