Bobulate


Archive for June, 2009

Template function parameters

Monday, June 15th, 2009

Started pushing patches produced in packaging KDE 4.3-beta2 on OpenSolaris to the KDE SVN repo yesterday. Started on some safe targets, like KPilot, where I know the codebase pretty well. One of my patches was “add a newline at end of file”, which is one of those kind-of-dorky fixes that sometimes need to happen.

There was one interesting patch, though, which again illustrates how gcc stretches (or violates) the C++ standard for convenience. The issue is templates that take function parameters, like this one:
template <int (*f)(int)> class A { } ;
This is a template you can instantiate with a single function that takes an int, returning int. For instance, close() would fit the bill. Or would it? In C++, functions also have linkage — this could be either C++ linkage (default) or extern “C” linkage. In the example above, f() has C++ linkage. This means that you can’t use close() as a parameter to this template, because close() has C linkage.

With gcc you can. It glosses over this difference, but standard-conforming compilers will refuse to compile this. It’s apparently a confusing and contentious issue, given the number of blog entries (like this one) and forum posts dedicated to it. If it had been just KPilot, I suppose I would have just committed stuff and moved on, but template function parameters show up in Akonadi as well, so I suspect they will get more use as time goes on.
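To make the difference concrete, here is a minimal sketch of my own (not taken from the actual patches) of the kind of instantiation that trips over this; gcc compiles it, while a strictly conforming compiler rejects it:

#include <unistd.h>                    // declares close() with C linkage

template <int (*f)(int)> class A { };  // f has C++ linkage by default

A<close> a;  // gcc accepts this; a strictly conforming compiler rejects it,
             // because close() has C linkage while f expects C++ linkage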

The point is you can’t write
template <extern "C" int (*f)(int)> class A { } ;
to specify the linkage of template parameter f. There are apparently two ways of working around this. One is to use a typedef to introduce the linkage into the function type, like so
extern "C" typedef int (*cfunc)(int);
template <cfunc foo> class A { } ;
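With that typedef the C linkage becomes part of the function-pointer type the template expects. As a usage sketch of my own (not the committed code), something like this should now pass a strict compiler:

#include <unistd.h>                    // close() has C linkage

extern "C" typedef int (*cfunc)(int);  // the typedef carries C linkage
template <cfunc foo> class A { };

A<close> a;  // accepted: foo and close() now agree on linkage

The flip side is that, under the same strict reading, an ordinary C++-linkage function no longer matches this template.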

The other is to introduce an intermediate function with C++ linkage (as expected by the original template) that calls the intended target function with C linkage, like so:
int _close(int fd) { return close(fd); }
// ..
A<_close> a(0);

Here we introduce an intermediate function _close that just calls the original function; the only difference is the linkage of the two. Making _close static inline should keep the overall impact low (I haven’t investigated).
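Putting the pieces together, a self-contained sketch of this second workaround looks roughly as follows (my illustration, not the committed patch). One caveat on the static inline suggestion: C++03 requires a function used as a template argument to have external linkage, so a strict compiler may balk at the static variant; C++11 later relaxed this.

#include <unistd.h>

template <int (*f)(int)> class A { };  // template we cannot change

// Wrapper with C++ linkage that just forwards to the C-linkage close().
// (Making it static inline should let the compiler fold the call away,
// but see the external-linkage caveat above for C++03.)
int _close(int fd) { return close(fd); }

A<_close> a;  // accepted: _close has the C++ linkage the template expects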

Which approach you choose depends on how much control you have over the template definition. For templates defined by an external library, only the latter might be possible.

The Poisoned Web?

Monday, June 15th, 2009

I was reading some article on the Register just now and one of the doubleclick ads was for OpenOffice. “Gosh, Free Software really has gone mainstream”, I thought, although the domain name of the target was a little odd; I clicked on it anyway (wear rubbers!) and ended up on www dot openoffice dash dash plus dot info slash nl slash and .. whoa. That’s a fascinating way of getting people to download a very big, presumably very poisonous, .exe file. It’s a good thing that the site is a mish-mash of Dutch and Spanish, as that might tip off some potential victims that this is not entirely kosher.

The Open Web

Monday, June 15th, 2009

The Open Web is a mix of technologies and concepts. Open web technologies and protocols (HTTP, HTML, CSS, Javascript, …) have carried the World Wide Web to success by implementing decentralization, transparency, extensibility, third-party innovation, bidirectional communication, and end-user usability and integration.

The NLUUG fall conference (expect around 280 attendees in Ede, the Netherlands) is on the Open Web this time around; if you’re working on web standards, web tools, interoperability or anything else covered by the call for abstracts (PDF, 817k, low information density), do drop the programme committee a note (and an abstract of around 200 words). Even policy wonks can have their say there. Deadline for submissions is the end of this month.

Depeche Mode on static variables in shared libraries

Sunday, June 14th, 2009

Corrupt
$ nepomuk-rcgen
Segmentation Fault (core dumped)

In Chains
$ CC -V
CC: Sun Ceres C++ 5.10 SunOS_i386 2009/03/06

Wrong
static QMutex s_mutex;

Fragile Tension
target_link_libraries(nepomuk-rcgen
-  ${SOPRANO_LIBRARIES}
   ${QT_QTCORE_LIBRARY}
   ${QT_QTDBUS_LIBRARY}
+  ${SOPRANO_LIBRARIES}
)

Perfect
$ nepomuk-rcgen
Usage: ...

(and I don’t even particularly like Depeche Mode, but reading Sebas’ plans made me give it a try. And I’m kind of hoping Thiago will point out “The truth is …”)

What’s the worst that could happen?

Sunday, June 14th, 2009

I wrote my previous bit on KDE4 OpenSolaris packages before all of them were done; I scheduled the post and left the packages building and went off to teach classes (networking and SSL technologies for high school kids). What’s the worst that could happen?

Well, for one thing a dependency on a non-released version of Soprano could show up — one that isn’t tested for in the CMake code either. I can understand this to some extent; after all, there is a great deal of co-development going on between Soprano and its clients. Fortunately, Soprano has since been tarballed and added to the KDE source download mirrors. I see on the KDE release management mailing list that there are a couple more gotchas, but those will only show up once I get past kdebase and kdepim.

The big show-stopper for KDE PIM on OpenSolaris right now is the behavior of nepomuk-rcgen, which just segfaults on startup. That effectively prevents KDE PIM from building, since the PIM build uses the latest Nepomuk CMake magic. On the other hand, things like Konsole do work.

One commenter was asking about SourceJuicer and p5i files. The p5i files are a new thing in pkg since OSOL 2009.06 — I don’t know whether pkgtool sets them up correctly, or even who is responsible for making these things, or where the publisher information is supposed to come from anyway. Any hints on what should go in the spec files to produce useful p5i files would be appreciated. As for jucr, these packages aren’t yet in any shape to push there — and there are various problems related to build-time dependencies as yet unresolved. For instance, you can’t depend on cmake as a build tool on the jucr build machines. Some of the KDE4 dependencies have been pushed there, but they are not building regularly. It’s easier to work outside of jucr, where updates can be done faster (and of course the spec file repo, if you want to build everything yourself, is on bionicmutton, as documented on techbase).

PS. Konqueror + certain proprietary plugins needed for popular culture (viz. YouTube) works, too. I’m impressed (by the Konqueror developers).

PPS. You can now install KDEgdm-integration normally with “pkg install”. Note that the fonts for Qt applications are butt-ugly; run Qt config and set up nicer ones, then run systemsettings to fix up KDE’s appearance. At some point I hope to figure out how to set better defaults for both on package install.

Itty-bitty note on IPS packages

Friday, June 12th, 2009

The KDE4 IPS packages that I’ve been working on have been rolled out at bionicmutton, so if you are a brave soul, you could install them by adding that as a package publisher (pkg.bionicmutton.org points to the same). Something like:
pfexec pkg set-authority -O http://solaris.bionicmutton.org:10000/ kdeips
pfexec pkg install KDEgdm-integration

This will get you an /opt/foss and a /opt/kde-4.2 (the latter actually contains KDE 4.3-beta2). And you should be able to run bits and pieces from there, or even log in to KDE from the regular OpenSolaris login manager.

Getting the packages out there did have some issues. These are entirely networking-related: I build them at home on one of my OSOL machines, then push them out to the IPS package server, which is running on a FreeBSD machine at the university. Since my DSL upload speed is low (about 50% faster than Jon’s), uploads of large files time out regularly. Pushing Qt or Boost is pretty much a no-go, as the upload will fall over after, say, 90MB or more. I suppose I could get on my bicycle and carry my laptop to a place where I could use the direct wired network to the server, but that would mean going outside.

Instead, I found — relatively well-hidden, which I find typical of Sun documentation — a wiki page with a nice-to-read HOWTO, which in turn points to a download page where you can get the tools. The universal toolkit image is sufficient; it contains pkg-toolkit/pkg/bin/archivepkgs.py, which will extract a package from a local repository and dump it to a tarball. In essence this defines an on-disk format for moving packages around: you can move the tarball to another IPS repo, unpack it, and serve the package from there.

This means that I can build, publish to a repository on localhost (no network timeouts!), dump to a tarball, scp to the actual server (scp is a lot more robust in that sense than python scripts doing web-services over slow links), unpack, and there it is. It’s still a 40 minute upload of Qt, but knowing that it will succeed is important.

Thanks to Kohsuke Kawaguchi for writing the original tarpkgs.py script and documenting it on his blog.

KDE Forum – Congratulations

Thursday, June 11th, 2009

Some months back, the KDE forums were re-launched under a community banner (they had previously been run by a third party), with a new and enthusiastic admin team. One of the first things the team achieved was the release of the forum software under a Free Software license (it was previously proprietary). And they haven’t sat still since.

One of the signs of the professionalism of the forum admin team (professionalism in the sense of upright behavior, reporting, being responsive and friendly, and having a clear sense of purpose even in a volunteer organization) is the forum staff blog, which posts staff goings-on or highlights particular topics. It’s missing the notice that one of the team needs to leave (for personal reasons), so I’d like to take this opportunity to thank Rob L. particularly for his work in re-invigorating one of the communication channels KDE users have amongst themselves and with developers.

Personally I’ve never liked forums. I remember using the phrase “loathe and abhor” in some discussions around running a KDE forum at all, but that doesn’t take away the fact that many people do like using forum software for discussions (instead of mailing lists or usenet). But one of the things the team has done really well is to make the forum useful even to grumpy old farts such as myself, by managing the forum and reporting on the results of forum discussions — that’s professionalism again.

A cool feature that results from this is the KDE brainstorm (announcement on the dot). Here’s a place to hammer away at ideas and mold them into something sufficiently concrete that it makes sense to push into KDE’s bug tracker as a wishlist item. Bug trackers are not a good place for conversation, after all. By consolidating “discussion for new features” in that place, we can add structure — and avoid having the same discussion elsewhere, like in response to release announcements (all of mark’s wishes on that page make sense in the brainstorm or as specific wishlist items in the bug tracker, but they’ll get lost as comments to a news item).

Best of all, there’s a monthly summary which adds review, statistics, triaging and some meta-comments to the results of a month’s brainstorm: this is a really good way to “listen to the people” (for developers) without getting bogged down in discussions. In a sense we are reaching (for) the point where we have a sales dude (e.g. Sayak) who is canvassing the customers for features and returns to the product development team and says “we should do this, because people are asking for it.” Then we follow the still traditional Open Source development model to see if it actually gets done, but the difference is in the channels used to get ideas.

So congratulations, forum team, on building up something vibrant and useful.

[[ One last thing, though: good ideas are in far more plentiful supply than time and developers and artists to implement them; twenty ten-minute jobs will, in spite of the naive math, chew up a whole week of developer time. There comes a point when even the best of ideas will get stalled for “no time” — so roll up your sleeves and exercise your Free Software rights (for instance as granted by section 2 of the GPLv2). ]]

O noes! My business model!

Thursday, June 11th, 2009

Spotted a peculiar and tendentious article on WebWereld (Dutch only), which suggests that Karl-Heinz Streibich of Software AG said that the (proprietary) software industry needs direct financial support. There’s no direct quote, though. The closest thing to a direct quote is the call for a “clear strategy” in Europe to strengthen the software industry.

I’m sure the Free Software world can come up with some good suggestions there. Skill building? Local autonomy? Escape from vendor lock-in? All of these would strengthen the European software industry at all levels, produce better software, and promote cooperation between public-sector organizations.

I have not seen this mentioned elsewhere, so it may just be lousy sensationalist reporting.

Now, there is an EU software strategy being formulated. It tries to bring together the viewpoints of various parties on what to do with software. LinuxJournal does a nice piece on it, highlighting the tensions between the EU itself, real Free Software organisations like the FSFE, and proprietary software companies.

IPS on FreeBSD

Thursday, June 11th, 2009

Right. So here’s a foolish plan: I want to publish some OpenSolaris packages to the world. They are not ready for SourceJuicer (submissions there are slightly complicated, and the forest of KDE packages I’m producing is tedious and time-consuming to import there). Last time I tried that publishing thing, I did it on a package server on an OSOL machine at the end of my own DSL line, which pretty much immediately spanked my bandwidth completely. So I want to do it on a machine with better connections.

It may surprise some that all the real servers I have run FreeBSD. I use OSOL as my main desktop operating system. There must be historical reasons for this. In any case, that means building the OpenSolaris IPS package server pkg.depotd on FreeBSD. It turns out to be relatively straightforward, as most of the actual server is written in Python (yay!) and there are only a few bits of C in there. So my approach was:

  • Fetch the sources for pkg as documented on the pkg project page (thank you Alan Coopersmith),
  • Install necessary development tools on my FreeBSD machine:
    portinstall py-openssl py-cherrypy intltool gnome-doc
    portinstall py25-mako py25-simplejson
  • Run gmake. At this point I realised that the makefile has some bugs, leading to “No rule to make target `help/C/package-manager.xml'” — there is some semi-complicated implicit-target stuff going on in there. I built the needed files by hand with
    cd gui
    msgfmt -o help/C/C.mo help/C/C.po
    xml2po -t help/C/C.mo -o help/C/package-manager.xml help/C/package-manager.xml.in

    and stripped out the locales that I didn’t need.
  • Build extract_hostid by hand with
    cd util/misc
    cc extract_hostid.c -o extract_hostid -L/usr/local/lib -R/usr/local/lib -lintl

    This is needed because gettext lives in /usr/local on FreeBSD and the Makefile (obviously) assumes the OSOL setup.
  • Fix the top-level makefile to use python from /usr/local/bin and run gmake again.
  • Patch setup.py to support FreeBSD — there seems to be support for sunos, linux, windows and darwin already (!?). This takes a patch at lines 102 and 194. Run gmake again.
  • Run gmake install
  • Finally,
    cd gate/proto/root_freebsd_amd64
    export PYTHONPATH=`pwd`/usr/lib/python2.4/vendor-packages:
    sh usr/lib/pkg.depotd -d ~/pkg/ips -p 10000

And there you have it. FreeBSD machines serving up OpenSolaris packages. For the purposes of KDE4 on Solaris, this means that pkg.bionicmutton will soon be returning and you will be able to install IPS packages instead of having to compile everything from source yourself. YMMV, contains rabid weasels, etc.

[[ And, as an addendum, let me congratulate the pkg / IPS team on the enormous improvement in the packaging system between OSOL 2008.11 and 2009.06; it used to be “ugh” and has reached, for me at least, a level of “hey, that’s pretty neat.” ]]

Travel and To-Do

Wednesday, June 10th, 2009

Spent yesterday in Germany. The usual applies: nice train ride, and for once the ICE from Arnhem wasn’t horrifically late or broken down (for some reason the ICE in the Netherlands and on the stretch to Oberhausen is unreliable, but after that very good). Battery life of laptop pretty much as expected and published: a little over 2 hours. That’s fine for the purposes for which I bought it. Ridiculously pleased about German food. For some reason I nearly always leave a Dutch restaurant feeling like I got ripped off, while Schweinshaxe (pork hock) and beer (I didn’t count, but it was tasty) seemed like an excellent deal.

Thought a little about a To-Do list based on the hacking on KDE 4.3 that I’ve done recently — very often patches get delayed, then blocked because of freezes, then bumped to the next cycle and delayed again .. it’s a maintenance nightmare when fixes are not sent upstream (e.g. to KDE SVN).

  • File bug report for CMake’s FindBoost. Attach patch to that bug report.
  • Fix up KPilot’s akonadi resources wrt. Boost includes.
  • Figure out how to package soprano — I’m told that KDE 4.3 beta relies on an unreleased Soprano version. Guys, that just makes it more difficult to build, package and test stuff.
  • Write a spec file / package for the oxygen icons so that KDE looks less empty — or less tango-y, as the case may be. Thanks to sebas, nuno and other commenters who pointed out that they have moved.
  • Finally merge in the kpci nested-anonymous-union changes.
  • Try to change qstringmatcher.h so it doesn’t define a type in an anonymous union — this isn’t critical, but it’s an annoying warning to get for each and every file that is compiled (the pattern is sketched after this list).
  • And dozens more patches to upstream, but these are the ones that bother me most.
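As for the qstringmatcher.h item above, here is a hypothetical reduction of the offending pattern (my own example, not the actual Qt header). Strict C++ only allows non-static data members inside an anonymous union, so defining a type there is a gcc-tolerated extension that pedantic compilers warn about in every file that includes the header:

// Hypothetical reduction, not the real qstringmatcher.h:
class Example {
    union {                 // anonymous union
        char raw[8];
        struct {            // defines a new type inside the anonymous union;
            short a, b;     // strict C++ permits only non-static data members here
        } packed;
    };
public:
    short first() const { return packed.a; }
};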