Paul Boddie's Free Software-related blog


Archive for the ‘Free Software’ Category

Packaging Kolab for Debian using pbuilder

Friday, November 15th, 2013

My recent excursion into Debian packaging with Kolab has involved a tour of lots of different tools and services, but it started out with a brief attempt to build the existing packages with pbuilder: a tool that has become fairly familiar to me in the process of experimenting with Debian packages and even contributing one to “Debian proper”. After some initial frustrations that prevented me from building packages using my normal workflow, I decided to familiarise myself with the infrastructure that the Kolab project itself uses to make packages, if only to reassure myself that the packages really could be built and didn’t require any special magic to do so. I won’t go into this because Timotheus has already done so in sufficient depth.

After some playing around with osc and the Development project in the Kolab OBS (Open Build Service), building packages, installing them in Debian root filesystems and User Mode Linux instances (administered by some scripts I’ve developed over the years), I persuaded myself to have another go at feeding the packages to pbuilder via the pdebuild tool, determined to overcome any build issues and to demonstrate that it could be done. One fairly good reason for doing this is that even though pbuilder-based builds can be sluggish as pbuilder decides to unpack an existing filesystem, install build dependencies, and do other housekeeping before actually building the package, it seems to be more efficient and quicker than osc, which I found took as long as 9 minutes before it was ready to build a relatively straightforward Python-based package. Another reason for going with pbuilder is that it is what the Debian project itself will be using, more or less, if – hopefully when – the packages get accepted back into Debian, whereas the OBS infrastructure seems to be based on other technologies for the management of the build environment.

Packages of the Future

One challenge posed by Kolab is that some of its packages depend on some of the other Kolab-provided packages when being built – those other packages are so-called “build dependencies” – but tools like pdebuild rely on such build dependencies already being available as installable packages from the Debian archive. For things like make and gcc, this isn’t a problem at all: they were packaged a very long time ago (although I suppose you could get caught out with very recent versions of such packages). But when a yet-to-be-submitted package is required as a build dependency for another package, the issue of satisfying that dependency arises, and this was something I hadn’t encountered before.

A perusal of the Debian Wiki provided a solution: after building the packages that the other ones rely upon, put them in a directory, expose them in a repository, and then make pbuilder consult that repository when deciding how it can satisfy those build dependencies. This involves three things:

  • A directory with newly built packages, obviously, together with the necessary package metadata.
  • A hook script that will refresh the metadata and perform the necessary update action that lets the pbuilder environment know about the packages.
  • A special configuration for pbuilder.

The Package Directory

At first, this will contain nothing at all, but as you add packages it will contain both .deb package files and the package metadata. We will call this directory deps.
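If you are following along, the directories used throughout this article can be created up front (the path being a placeholder, as elsewhere in the examples):

mkdir -p /path-to-kolab-packaging/deps /path-to-kolab-packaging/hooks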

The Hook Script

As the Debian Wiki page describes, a hook called D05deps can be placed in a directory called, say, hooks and populated with the following code:

#!/bin/sh
# Regenerate the package index for the local repository of newly built packages.
(cd /path-to-kolab-packaging/deps; apt-ftparchive packages . > Packages)
# Let the build environment take notice of the updated repository.
apt-get update

I did find that the permissions on the hook file were crucial, fixing them as follows:

chmod a+x hooks/D05deps

Otherwise, no mention of the script will be made in the copious output from pbuilder and it will simply be ignored. I also experienced some initial problems with the package metadata, but running the first line of actual commands from the script and manually producing the metadata was enough to get pbuilder on the right track:

cd deps; apt-ftparchive packages . > Packages

After that, it didn’t have any difficulties seeing the new packages as I added them to the deps directory.

The Configuration File

All this file has to do is point to the deps and hooks directories. You can more or less copy the contents of /etc/pbuilderrc and add the following to it (customising it for your own choices, of course):

OTHERMIRROR="deb [trusted=yes] file:///path-to-kolab-packaging/deps ./"
BINDMOUNTS="/path-to-kolab-packaging/deps"
HOOKDIR="/path-to-kolab-packaging/hooks"
EXTRAPACKAGES="apt-utils"

Yes, there really isn’t anything here that differs from the example on the Debian Wiki page. I put this configuration file alongside the deps and hooks directories and called it pbuilderrc.

Actually Building Packages

With the above extra stuff in place, the process of building packages is slightly different: you have to tell pbuilder to use this alternative configuration, and then you just hope that all the different aspects of it are consistent and that pbuilder is able to take notice of it. The command that will eventually be run inside a directory containing “Debianized” sources is the following:

pdebuild -- --distribution wheezy --override-config --configfile ../pbuilderrc

Obviously, the pbuilderrc file resides in the parent directory after you change into a package’s sources directory.

Build Order

Above, I mentioned that some packages depend on others in order to be built. Finding out which packages are affected involves consulting their build dependencies, which are conveniently listed in their .dsc files. Doing the following permits a general overview to be obtained and the basis of a suitable build order to be worked out:

grep ^Build-Depends *.dsc

Obviously, it makes sense to start with packages that do not depend on others that are also being built for this exercise. Devising or discovering an automated approach for this is left as an exercise for the reader (although a rough sketch follows the list below), but Kolab is relatively uncomplicated and I used the following build order:

python-icalendar pykolab libkolabxml libcalendaring libkolab kolab kolab-freebusy kolab-schema kolab-syncroton kolab-utils kolab-webadmin mozilla-ldap-sdk chwala irony pyasn1-modules php-http-request2 roundcubemail roundcubemail-plugin-contextmenu roundcubemail-plugin-dblog roundcubemail-plugin-threadingasdefault roundcubemail-plugins-kolab smarty3
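For those who would like to automate this, here is a rough, hypothetical sketch using tsort from coreutils. It assumes that all the .dsc files reside in the current directory and, rather simplistically, that build dependency names match source package names (which happens to hold approximately for this collection):

#!/bin/sh
# Emit "dependency package" pairs for build dependencies that are
# themselves part of this batch, then order them topologically.
for dsc in *.dsc; do
    src=$(sed -n 's/^Source: //p' "$dsc")
    echo "$src $src"  # ensures packages without local dependencies appear too
    sed -n 's/^Build-Depends: //p' "$dsc" | tr ',' '\n' | \
        sed 's/^ *//; s/[ (|].*//' | while read dep; do
        # Keep only dependencies for which we also have a source package here.
        if ls "${dep}"_*.dsc >/dev/null 2>&1; then
            echo "$dep $src"
        fi
    done
done | tsort

Since tsort prints each item after all of its predecessors, the output is a usable build order.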

The General Workflow

To obtain, unpack and build the packages I used the following workflow:

  1. Visit the package downloads page (found via the Development project’s repository overview) and obtain a list of package URLs.
  2. Get each package using dget from the devscripts package. This will probably give a warning about unsigned or unverifiable packages and not unpack the sources. (I suppose that importing the repository key using apt-key fixes this. One should obviously consider the risks and recommendations around downloading code from the Internet.)
  3. Unpack each package using dpkg-source. For example:
    dpkg-source -x python-icalendar_3.4-1.dsc
  4. Change into the source directory.
  5. Run the pdebuild command given above.
  6. Copy or move the resulting package files from /var/cache/pbuilder/result into the deps directory.
  7. Optional but tidy: move the other artefacts of building from the parent directory into some other place for future reference.

There are probably much more efficient and cleaner ways of building lots of packages, but this allowed me to inspect them and to consider a few changes.
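As a gesture towards such efficiency, here is a hypothetical script automating steps 2 to 6 for a list of .dsc URLs stored one per line in a file called package-urls.txt (a name of my own choosing). It assumes non-native packages whose unpacked source directories follow the usual name-version convention:

#!/bin/sh
set -e
while read url; do
    dget -u -d "$url"        # download only, skipping signature verification
    dsc=${url##*/}           # e.g. python-icalendar_3.4-1.dsc
    dpkg-source -x "$dsc"
    name=${dsc%%_*}
    version=${dsc#*_}; version=${version%.dsc}
    (cd "$name-${version%-*}" && \
        pdebuild -- --distribution wheezy --override-config --configfile ../pbuilderrc)
    # Note that the result directory accumulates packages from earlier builds.
    cp /var/cache/pbuilder/result/*.deb deps/
done < package-urls.txt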

Making Changes

I took the opportunity to make some changes to the python-icalendar package because I saw that it wanted me to install python-setuptools before pdebuild would even launch pbuilder. I have little confidence in setuptools generally and would prefer not to have it on my system. Indeed, it is an unfortunate but recurring phenomenon that the setup.py script commonly used to prepare or install Python software packages employs setuptools when its functionality only requires the older and less disruptive distutils library. Regardless of whether such changes are desired in the eventual Kolab packaging, I took the opportunity to investigate how changes should be made to the package.

Modern Debian packages prefer such “upstream patching” – where the code being changed originates with the actual developers of the software, as opposed to people packaging it for different distributions – to be done using a tool called quilt. I have some experience using quilt for my previous packaging work, but it’s easy to forget how to use it. Fortunately, the Debian Wiki came to the rescue once again. Here’s what I did in the sources directory for the package:

quilt new setup.py.patch # tell quilt about my patch
quilt add setup.py # tell quilt that I will patch setup.py
# Now, I edited setup.py to replace setuptools with distutils.
quilt refresh # update the patch within quilt
quilt pop -a # go back to the way everything was (but remember the patch for later)

When running pdebuild, the tool will notice such patches and apply them. When the build finishes, they are bundled into a file containing all the necessary patches that allow the software to be built as a package. In this case, a file called python-icalendar_3.4-1.debian.tar.gz was produced.

One or two things are probably necessary to make this work:

  • A suitable quilt configuration as described on the Debian Wiki page referenced above.
  • A suitable debian/source/format file containing the 3.0 (quilt) value.
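The latter is easily created if it is missing:

mkdir -p debian/source
echo "3.0 (quilt)" > debian/source/format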

What Next?

Some of the previously encountered pitfalls had very little to do with the actual packaging of Kolab in terms of getting the software installed, but were more to do with the way it behaved once configured (and perhaps how the configuration gets done). I intend to look a bit more closely at the configuration process and to see if some of the awkward situations that may arise can’t be diagnosed and remedied by some helpful enhancements to the tools. On the way, I expect to find areas of improvement in the ways some things are done – that’s just the way things are with software – but with regard to the packaging itself most of the hard work has already been done and it seems to hold up rather well. So thanks are obviously due to Paul and Jeroen (and others) for allowing me to join in at this fairly late stage in the game.

Notions of Progress on the Free Software Desktop

Thursday, November 7th, 2013

Once again, discussion about Free Software communities is somewhat derailed by reflections on the state of the Free Software desktop. To be fair to participants in the discussion, the original observations about communities were so unspecific that people would naturally wonder which communities were being referenced.

Usability and Accessibility

As always, frustrating elements of recent Free Software desktop environments were brought up for criticism and evaluation. One of them concerned the “plasmoid” enhancements of KDE 4 (or KDE Plasma Desktop as it is known according to the rebranding of KDE assets) which are often regarded as superfluous distractions from the work of perfecting the classic desktop environment. Amidst all this, the “folder view” plasmoid (or desktop widget) in particular came under scrutiny. As I understand it, the “folder view” is just a panel or window that groups icons in a region on the desktop background, and I acknowledged that it certainly represents an improvement over managing icons on a normal desktop, but that it can also confuse people when they accidentally close the folder view – easy to do with a stray mouse click – leaving them to wonder where their icons went.

Such matters of usability make me wonder how well tested some of the concepts employed in these environments really are, despite insistences that usability experts have been involved and that non-experts in the field of usability are unable to see the forest for the trees. From my own experiences, I feel that the developers would really benefit from doing phone support for their wares, especially with users who haven’t learned all the fancy terminology and so must describe what they see from first principles and be told what to do at a similar level. Even better: such support should be undertaken from memory and not sitting in front of a similarly configured computer.

Although it is a somewhat separate discipline with different constraints, I also suspect that such “over the phone” exercises might help accessibility as well. An inexperienced user provides different information to that provided by something like a screen reader: the former may struggle to articulate concepts where the latter merely describes the environment according to prescribed terms, and the former can use more flexible powers of description whereas the latter can only rely on the cooperation of other programs to populate a simplistic description of the state of the environment. Nevertheless, the exercise of being a person cut off from the rich graphical scenery and their familiar interaction mechanisms might put the usability and accessibility of the software into perspective for the developers.

The Measure of Progress

But back to the Free Software desktop in general, if only to contemplate notions of progress and to consider whether lessons really have been learned, or whether people would rather not think about the things which went wrong, labelling them as “finished business” or “water under the bridge” and urging people not to bring such matters up again. One participant remarked about how it took six years from 2005 to 2011 for KDE 4 to become as usable as its predecessor. A response to this indicated that this was actually “fantastic” progress given that Google used as much time to make Android “decent”.

Fantastic it may be, but we should not consider the endeavour as a software development project in isolation, with the measure of success being that something was created from nothing in six short years. Indeed, we must consider what was already there – absolutely not nothing – and how the result of the development measures up against that earlier system. As far as getting Free Software in front of people and building on earlier achievements are concerned, those six years can almost be considered six lost years. Nobody should be patting themselves on the back upon hearing that someone in 2013 can move from KDE 3 to KDE 4 and feel that at least they didn’t lose much functionality.

The Role of Applications

It was also noted that KDE development now focuses more on application development than on the environment itself. One must therefore ask where we are with regard to parity with the suite of applications running under KDE 3. Here, I can only describe my own experiences, but this should be flattering to any constrained selection of updated applications because of my own rather conservative application choices.

Kontact is usable because I imagine various companies needed it to be usable to stay in business (and even then I don’t know the story of the diversions via Nepomuk and other PIM initiatives that could have endangered that application’s viability); Digikam is usable because the developers remained interested in improving the software and even maintained the KDE 3 version for a while; Okular has picked up where KPDF left off; K3B still works much the same as before. There are presumably regressions and improvements in all these: Kontact, for instance, is much slower in certain areas such as message sorting, but it probably has more robust and coherent PGP and S/MIME support than its predecessor (which may have been suffering from lack of maintenance at both the project and distribution level).

Meanwhile, Amarok has become a disaster with an incoherent interface involving lots of “in the know” controls, and after it stopped playback mid-track for the nth time and needed a complete restart to get sound back, I switched to Minirok out of desperation. Other applications took a permanent holiday, such as Kopete which I don’t miss because my IRC needs are covered by Konversation.

Stuff like Konqueror is still around, despite being under threat of complete replacement by Dolphin, although it has picked up the little “+” and “-” controls that pervade KDE now. Such controls confuse various classes of user through poor visual contrast (a tiny symbol in red or green superimposed on a multicolour icon!) while demanding from such users better than average motor skills (“to open the document aim at the tiny area but not the tiny area within the tiny area”).

Change You Can Believe In?

You wouldn’t think that I appreciate the work done on the Free Software desktop, but I do. What frustrates me and a lot of other people, however, is the way that things that should have been “behind the scenes” infrastructure improvements (Qt 3 being superseded by Qt 4, for instance) that could have been undertaken whilst preserving continuity for users have instead been thrust at those users in the form of unnecessary decisions about which functionality they can afford to lose in order to have a supported and secure system that will not gradually fall apart over time. (Not that KDE is unique in this respect, consider the Python 2 to Python 3 transition and the disruption even such a leisurely transition can cause.)

Exposing change to a group of people creates costs for those people, and when those people have other things than computing as the principal focus in their lives, such change can have damaging effects on their continued use of the software and on the quality of their lives. Following the latest trends, discovering the newest software, or just discovering how their existing software functions since the last vendor-initiated update are all distractions for people who just want to sit down, do some things on the computer, and then go back to their lives. In today’s gadget-pushing society, the productivity benefits of personal computing are being eroded by a fanaticism for showing off new and different things mostly for the sake of them being, well, new and different. Bored children may enjoy the fire-hose of new “apps”, tricks and gadgets, but that shouldn’t mean that everybody else has to be made to enjoy it as well or be considered backward “technophobes” who “don’t understand” or “won’t embrace” new technology.

One can argue that by failing to shield users from the cost of change, especially when the level of functionality remains largely similar, Free Software desktop developers have imperilled their own mission with the result that they now have to make up lost ground in the struggle to get people to use their software. But even to those developers who don’t care about such things, the other criticism that could be levelled against them might be a more delicate matter and more difficult to reconcile with their technical reputation: churning up change and making others deal with it can arguably be regarded as bad software project management and, indeed, bad project management in general.

Maybe such considerations also have something to say about the direction any given community might choose to follow, and whether bold new ideas should be embraced without a thorough consideration of the consequences.

Neo900: And they’re off!

Friday, November 1st, 2013

Having mentioned the Neo900 smartphone initiative previously, it seems pertinent to note that it has moved beyond the discussion phase and into the fundraising phase. Compared to the Ubuntu Edge, the goals are fairly modest – 25000 euros versus tens of millions of dollars – but the way this money will be spent has been explained in somewhat more detail than appeared to be the case for the Ubuntu Edge. Indeed, the Neo900 initiative has released a feasibility study document describing the challenges confronting the project: it contains a lot more detail than the typical “we might experience some setbacks” disclaimer on the average Kickstarter campaign page.

It’s also worth noting that as the Neo900 inherits a lot from the GTA04, as the title of the feasibility study document indicates when it refers to the device as the “GTA04b7”, and as the work is likely to be done largely under the auspices of the existing GTA04 endeavour, the fundraising is being done by Golden Delicious (the originators of the GTA04) themselves. From reading the preceding discussion around the project, popular fundraising sites appear to have conditions or restrictions that did not appeal to the project participants: Kickstarter has geographical limitations (coincidentally involving the signatory nations of the increasingly notorious UKUSA Agreement), and most fundraising sites also take a share of the raised funds. Such trade-offs may make sense for campaigns wanting to reach a large audience (and who know how to promote themselves to get prominence on such sites), but if you know who your audience is and how to reach them, and if you already have a functioning business, it could make sense to cut the big campaign sites out of the loop.

It will certainly be interesting to see what happens next. An Openmoko successor coming to the rescue of a product made by the mobile industry’s previously most dominant force: that probably isn’t what some people expected, either at Openmoko or at that once-dominant vendor.

Dell and the Hardware Vendors Page

Thursday, October 31st, 2013

Hugo complains about Dell playing around with hardware specifications on their Ubuntu-based laptop products. (Hugo has been raising some pretty interesting issues, lately!)

I think that one reason why Dell was dropped from the Hardware Vendors page on the FSFE Fellowship Wiki was that even though Dell was promoting products with GNU/Linux pre-installed, actually finding them remained a challenge involving navigating through page after page of “Dell recommends Windows Vista/Windows 8/Windows Whatever” before either finding a low-specification and overpriced afterthought of a product or the customer just giving up on the whole idea.

Every time they “embrace Linux” I’d like to think that Dell are serious – indeed, Dell manages to support enterprise distributions of GNU/Linux on servers and workstations, so they can be serious, making their antics somewhat suspiciously incompetent at the “home and small office” level – but certainly, the issue of the changing chipset is endemic: I’m pretty sure that a laptop I had to deal with recently didn’t have the advertised chipset, and I tried as hard as possible to select the exact model variant, knowing that vendors switch things out “on the quiet” even for the same model. On that occasion, it was Lenovo playing around.

The first thing any major vendor should do to be taken seriously is to guarantee that if they sell a model with a specific model number then it has a precise and unchanging specification and that both the proper model number and the specification are publicly advertised. Only then can we rely on and verify claims of compatibility with our favourite Free Software operating systems.

Until then, I can only recommend buying a system from a retailer who will stand by their product and attempt to ensure that it will function correctly with the Free Software of your choice, not only initially but also throughout a decent guarantee period. Please help us maintain the Hardware Vendors page and support vendors and retailers who themselves support Free Software.

(Note to potential buyers and vendors: the Hardware Vendors page does not constitute any recommendation or endorsement of products or services, nor does the absence of any vendor imply disapproval of that vendor’s products. The purpose of the page is to offer information about available products and services based on the experiences and research of wiki contributors, and as such is not a marketplace or a directory where vendors may request or demand to be represented. Indeed, the best way for a vendor to be mentioned on that page is to coherently and consistently offer products that work with Free Software and that satisfy customer needs so that someone may feel happy enough with their purchase that they want to tell other people about it. Yes, that’s good old-fashioned service being recognised and rewarded: an unusual concept in the modern world of business, I’m sure.)

The inside of some random Dell computer at a former workplace - this one may not have been running GNU/Linux, but my Dell workstation was

More on Kolab and Debian

Wednesday, October 23rd, 2013

Well, after my recent blog post highlighting some surprising problems with my Kolab installation – not at all a complaint about the packages, really, but more of a contribution towards improving the packaging situation, as I see it at least – some more interest in the situation around Kolab packaging for Debian has been shown.

Packaging for Debian can be a challenge. My own experience involved a pure-Python tool and still required lots of iterations to satisfy the Debian gatekeepers; this is understandable given that they try to virtually guarantee a coherent experience and provide a large selection of software whose copyright and licensing status must be clear, acceptable and without nasty surprises. I respect the effort that has gone into Kolab packaging for Debian already: without that effort, I probably wouldn’t even have tried the software.

The plan now must surely involve input from the Debian groupware initiative, especially as the Kolab architecture presumably resembles some of the other packaged solutions, and those who have contributed to the existing packaging work, as well as some discussion on the Kolab development mailing list, and some effort with the Open Build Service tools (with the “build commander” tool fortunately being available as a Debian package).

It is unfortunate that as Torsten points out, “Currently, there’s only one volunteer working on the Debian packages in his limited spare time, but hundreds of people who want to use reliable Debian packages.” Meanwhile, Timotheus points out, “Since there seems to be no corporate funding available for the Debian packages, we all need to pull together as a community and get it done!” It seems to me that those organisations that stand to benefit from more adoption of Free Software groupware, especially those using Debian as their foundation, might do well to assist this work instead of waiting for people to get it done in their free time.

Kolab and Debian Packaging Pitfalls

Monday, October 21st, 2013

Hugo Roy has been trying to install Kolab and not getting on particularly well with it. His experiences persuaded me to take another look at my Kolab installation done back in June, and to my surprise it didn’t seem to work any more. I eventually discovered some things that will probably need fixing in the packaging, and these are mentioned below. I suppose I’ll try and pursue these with the developers and packagers.

The LDAP server (provided by the 389-ds suite of packages, but actually started when the ns-slapd program is run, and known as the dirsrv service – yes, all very confusing stuff) doesn’t want to run until the permissions are fixed on the /var/run/dirsrv and /var/lock/dirsrv directories so that the ns-slapd program can create pid and lock files.
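A fix along these lines should do the trick, assuming that the service runs under the dirsrv user and group set up by the packages (verify this on your own system before copying it):

chown dirsrv:dirsrv /var/run/dirsrv /var/lock/dirsrv
service dirsrv start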

The kolab-saslauthd service won’t be running if the LDAP server isn’t running. (You can check this using service --status-all and seeing what is running and what isn’t.) Some Kolab programs seem to get upset when they can’t connect to the LDAP or IMAP servers, and if the LDAP server is brought up, there’s a frequent, recurring error from a Python program complaining about IMAP server connections failing…

Traceback (most recent call last):
  File "/usr/lib/python2.7/dist-packages/kolabd/process.py", line 44, in synchronize
    auth.synchronize()
  File "/usr/lib/python2.7/dist-packages/pykolab/auth/__init__.py", line 243, in synchronize
    self._auth.synchronize()
  File "/usr/lib/python2.7/dist-packages/pykolab/auth/ldap/__init__.py", line 860, in synchronize
    callback=self._synchronize_callback,
  File "/usr/lib/python2.7/dist-packages/pykolab/auth/ldap/__init__.py", line 2151, in _search
    secondary_domains
  File "<string>", line 10, in <module>
  File "/usr/lib/python2.7/dist-packages/pykolab/auth/ldap/__init__.py", line 1895, in _persistent_search
    secondary_domains=secondary_domains
  File "/usr/lib/python2.7/dist-packages/pykolab/auth/ldap/__init__.py", line 1735, in _synchronize_callback
    eval("self._change_none_%s(entry, change_dict)" % (entry['type']))
  File "<string>", line 1, in <module>
  File "/usr/lib/python2.7/dist-packages/pykolab/auth/ldap/__init__.py", line 1389, in _change_none_user
    self.imap.connect(domain=self.domain)
  File "/usr/lib/python2.7/dist-packages/pykolab/imap/__init__.py", line 144, in connect
    self._imap[hostname].login(admin_login, admin_password)
  File "/usr/lib/python2.7/dist-packages/pykolab/imap/cyrus.py", line 133, in login
    cyruslib.CYRUS.login(self, *args, **kw)
  File "/usr/lib/python2.7/dist-packages/cyruslib.py", line 416, in login
    self.__doexception("LOGIN", error)
  File "/usr/lib/python2.7/dist-packages/cyruslib.py", line 359, in __doexception
    self.__doraise( function.upper(), msg )
  File "/usr/lib/python2.7/dist-packages/cyruslib.py", line 368, in __doraise
    raise CYRUSError( idError[0], mode, msg )
CYRUSError: (10, 'LOGIN', 'generic failure')

Restarting the kolab-saslauthd service fixes this; maybe restarting the cyrus-imapd service also helps. Restarting the kolab-server service should apparently synchronise the constituent services, but I’m not sure it helps if you get the above Python error. You may also see an LDAP-related error which just appears to be the same program or a related one getting even more upset about the LDAP server.
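For reference, the commands in question are the following:

service kolab-saslauthd restart
service cyrus-imapd restart   # possibly helpful
service kolab-server restart  # supposedly synchronises the constituent services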

Also, if you don’t update for a while, the clamav-freshclam service uses a lot of CPU and bandwidth performing updates. Such stuff needs turning off if you value your computer’s interactivity, in my experience.
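On a sysvinit-based Debian system, one way of turning it off (sacrificing timely virus signature updates, of course) would be this:

service clamav-freshclam stop
update-rc.d clamav-freshclam disable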

Neo900: Combining Communities to Create Opportunities

Tuesday, September 10th, 2013

Ever since the withdrawal of Openmoko from open smartphone development, it appears to have been challenging to find large numbers of people who might be interested in supporting similar open hardware efforts, either by having them put down money to fund the development and production of devices, or by encouraging them to develop Free Software to run on the hardware produced by those efforts. That anyone can go and buy an Android phone and tell themselves that it is just like that dream they once had of running Linux on a phone (if they turn the lights down low enough and ignore the technical and ethical limitations) serves as just enough of a distraction to keep people merely curious about things like Openmoko and open hardware, persuading them to hold off supporting such things until everybody else has jumped on board and already made it a safe choice. It almost goes without saying that where risk-takers are needed to make something happen, that thing is not going to happen if everybody looks to everybody else to take the risk. (And even when people do take the risk, they seem to think that their pledges and donations are as good as money in the bank, but that is another story.)

Naturally, the Ubuntu Edge campaign showed that some money is floating around and can be attracted to suitably exciting projects. Unfortunately, one may be tempted to conclude that anything more mundane than a next generation product – one that can only be delivered at some point in the future, once it becomes feasible and economic to manufacture and sell something with “out of this world” specifications – is unlikely to attract the interest of potential customers with money to pledge towards something. Such potential customers surely want something their money cannot already buy, and offering only things like openness and freedom as enhancements to today’s specifications is perhaps not exciting enough for some of those people.

It is therefore rather refreshing that two communities have recently become more aware of the possibilities offered by, and available to, open hardware: the OpenPhoenux community with their ongoing GTA04 project to follow on from the work of Openmoko, and the Maemo community seeking a sustainable future beyond the now-discontinued Nokia N900 smartphone. Despite heroic efforts to sustain the GTA04 project, outside interest has apparently been low enough that additional production has been placed on hold: a minimum number of orders needs to exist before any kind of further manufacturing can take place. Meanwhile, a community of people whose devices may one day fail – or perhaps have failed already – forcing them to seek replacements in the second-hand market with all the usual online auction profiteering and purchasing uncertainties, has been made aware of an active hardware project whose foundations largely resemble those of the devices they wish to sustain.

So, unlike Ubuntu Edge, the Neo900 initiative is not offering next year’s hardware. In fact, it is not even offering this year’s hardware. But what it does offer is a sustainable path into the future for those who like the form factor and software provided by the N900: people who were having to come to terms with buying a device that would not be as satisfactory as the one they already have, merely because the device they already have has reached the end of its usable life, and because the mobile device industry has a different idea of progress from the one they happen to have. In effect, the Neo900 is about taking control, owning the roadmap, deciding when or whether the fads and fashions of the industry at large will serve them better, and being able to choose or to reject the wider industry’s offerings on a more reasonable timescale.

The N900, as a product abandoned some time ago by Nokia as it retreated into being a vassal state of the Microsoft empire, gets an opportunity to rise from the ashes of the ruin wrought by the establishment of that corporate relationship. At a time where Nokia sees its core business incorporated into Microsoft itself in the final chapter of what has to be one of the most widely predicted and reported acts of alleged corporate looting in recent years, and where former Nokia executives announce plans to re-establish the business independently by attracting neglected Nokia talent, the open phoenix in the form of OpenPhoenux may help the N900 to rise above its troubled past and to shine once again as its former custodians struggle with the mayhem of corporate integration or corporate reconstruction, depending on where they end up.

People might wonder why anyone would want more of the same rather than something new, different, exciting, shiny. The fact is that away from the noise of exhibition floor, trade show and developer conference demonstrations, most people just want something that works and, preferably, something they already know. Their life goes on and does not wait for them to have to learn the latest gestures and moves to make some new gadget do what their old gadget was doing before it broke down. Some people – those with an N900 or those who wanted one – now have a new opportunity available to them, thanks to open hardware and the Neo900 initiative. For the rest of us, it offers more choice and maybe some hope that open hardware will be able to cater to more people in times to come.

Terms of the Pirates

Tuesday, September 3rd, 2013

According to the privacy policy on the Web site of the UK Pirate Party, “The information generated by the cookie about your use of the website (including your IP address) will be transmitted to and stored by Google on servers in the United States.” Would it not be more appropriate for the pirates to do their own visitor analysis using a Free Software solution like Piwik?

Come on, pirates, you can do better than this!

Licensing in a Post Copyright World: Some Clarifications

Sunday, July 28th, 2013

Every now and then, someone voices their dissatisfaction with the GNU General Public License (GPL). A recent example is the oddly titled Licensing in a Post Copyright World: odd because if anything copyright is getting stronger, even though public opposition to copyright legislation and related measures is also growing. Here I present some necessary clarifications for anyone reading the above article. This is just a layman’s interpretation, not legal advice.

Licence Incompatibility

It is no secret that code licensed only under specific versions of the GPL cannot be combined with code under other specific versions of the GPL such that the resulting combination will have a coherent and valid licence. But why are the licences incompatible? Because the decision was taken to strengthen the GPL in version 3 (GPLv3), but since this means adding more conditions to the licence that were not present in version 2 (GPLv2), and since GPLv2 does not let people who are not the authors of the code involved add new conditions, the additional conditions of GPLv3 cannot be applied to the “GPLv2 only” licensed code. Meanwhile, the “GPLv3 only” licensed code requires these additional conditions and does not allow people who are not the authors of the code to strip them away to make the resulting whole distributable under GPLv2. There are ways to resolve this as I mention below.

(There apparently was an initiative to make version 2.2 of the GPL as a more incremental revision of the licence, although incorporating AGPLv3 provisions, but according to one of the central figures in the GPL drafting activity, work progressed on GPLv3 instead. I am sure some people wouldn’t have liked the GPLv2.2 anyway, as the AGPLv3 provisions seem to be one of many things they don’t like.)

Unnecessary Amendments

Why is the above explanation about licence compatibility so awkward? Because of the “only” stipulation that people put on their code, against the advice of the authors of the licence. It turns out that some people have so little trust in the organisation whose licence they have nevertheless chosen to use that, in a flourish of self-assertion, they needlessly stipulate “only” instead of “or any later version” and feel that they have mastered the art of licensing.

So the predicament of projects that put “only” everywhere, becoming “stuck” on certain GPL versions, is of their own making, like someone seeing a patch of wet cement and realising that their handprint can be preserved for future generations to enjoy. Other projects suffer from such distrust, too, because even if they use “or any later version” to future-proof their licensing, they can be held back by the “only” crowd if they make use of that crowd’s software, rendering the licence upgrade option ineffective.

It is somewhat difficult to make licences that request that people play fair and at the same time do not require people to actually do anything to uphold that fairness, so when those who write the licences give some advice, it is somewhat impertinent to reject that advice and then to blame those very people for one’s own mistake later on. Even people who have done the recommended thing, but who suffer from “only” proliferation amongst the things on which their code depends should be blaming the people who put “only” everywhere, not the people who happened to write the licence in the first place.

A Political Movement

The article mentions that the GPL has become a “political platform”. But the whole notion of copyleft has been political from the beginning because it is all about a social contract between the developers and the end-users: not exactly the preservation of a monopoly on a creative work that the initiators of copyright had in mind. The claim is made that Apple shuns GPLv3 because it is political. In fact, companies like Apple and Nokia chiefly avoid GPLv3 because the patent language has been firmed up and makes those companies commit to not suing recipients of the code at will. (Nokia trumpeted a patent promise at one point, as if the company was exhibiting extreme generosity, but it turned out that they were obliged to license the covered patents because of the terms of GPLv2.) Apple has arguably only accepted the GPL in the past because the company could live with the supposed inconvenience of working with a wider development community on that community’s terms. As projects like WebKit have shown, even when obliged to participate under a copyleft licence, Apple can make collaboration so awkward that some participants (such as Google) would rather cultivate their own fork than deal with Apple’s obsession to control everything.

It is claimed that “the license terms are a huge problem for companies”, giving the example of Apple wanting to lock down their products and forbid anyone from installing anything other than Apple-approved software on devices that they have paid for and have in their own possession, claiming that letting people take control of their devices would obligate manufacturers to “get rid of the devices’ security systems”. In fact, it is completely possible to give the choice to users to either live with the restrictions imposed by the vendor and be able to access whichever online “app” store is offered by that vendor, or to let those users “root” or “jailbreak” their device and to tell them that they must find other sources of software and content. Such choices do not break any security systems at all, or at least not ones that we should be caring very much about.

People like to portray the FSF as being inflexible and opposed to the interests of businesses. However, the separation of the AGPL and the GPL contradicts such convenient assertions. Meanwhile, the article seems to suggest that we should blame the GPL for Apple’s inflexibility, which is, of course, absurd.

Blaming the Messenger

The article blames the AGPLv3 for the proliferation of “open core” business models. Pointing the finger at the licence and blaming it for the phenomenon is disingenuous since one could very easily concoct a licence that requires people to choose either no-cost usage, where they must share their code, or paid usage, where they get to keep their code secret. The means by which people can impose such a choice is their ownership of the code.

Although people can enforce an “open core” model more easily using copyleft licensing as opposed to permissive licensing, this is a product of the copyright ownership or assignment regime in place for a project, not something that magically materialises because a copyleft licence was chosen. It should be remembered that copyleft licences effectively regulate and work best with projects having decentralised ownership. Indeed, people have become more aware of copyright and licensing transfers and assignments perhaps as a result of “open core” business models and centralised project ownership, and they should be distrustful of commercial entities wanting such transfers and assignments to be made, regardless of any Free Software licence chosen, because they designate a privileged status in a project. Skepticism has even been shown towards the preference that projects transfer enforcement rights, if not outright ownership, to the FSF. Such skepticism is only healthy, even if one should probably give the FSF the benefit of the doubt as to the organisation’s intentions, in contrast to some arbitrary company who may change strategy from quarter to quarter.

The article also blames the GPLv3 or the AGPLv3 for the behaviour of “licence trolls”, but this is disingenuous. If Oracle offers a product with a choice of AGPLv3 or a special commercial licence, and if as a consequence those who want permissively licensed software for use in their proprietary products cannot get such software under permissive licences, it is not the fault of any copyleft licence for merely existing: it is the fault (if this is even a matter of blame) of those releasing the software and framing the licence choices. Again, the FSF’s copyleft licences do not need to exist for anyone to offer customers a choice between paying money and making compromises on how they offer their own work.

Of course, if people really cared about the state of projects that have switched licences, they would step up and provide a viable fork of the code starting from a point just before the licence change, but as can often be the case with permissively licensed software and a community of users dependent on a strong vendor, most people who claim to care are really looking for someone else to do the work so that they can continue to enjoy free gifts with as few obligations attached as possible. There are permissively licensed software projects with vibrant development communities, but remaining vibrant requires people to cooperate and for ownership to be distributed, if one really values community development and is not just looking for someone with money to provide free stuff. Addressing fundamental matters of project ownership and governance will get you much further than waving a magic wand and preferring permissive licensing, because you will be affected by those former things whichever way you decide to go with the latter.

Defining the New Normal

The article refers to BusyBox being “infamous” for having its licence enforced. That is a great way of framing reasonable behaviour in such a way as to suggest that people must be perverse for wanting to stand behind the terms under which, and mechanisms through which, they contributed their effort to a project. What is perverse is choosing a licence where such terms and mechanisms are defined and then waiving the obligation to defend it: it would not only be far easier to just choose another licence instead, but it would also be more honest to everyone wanting to use that project as well as everyone contributing to the project, too. The former group would have legal clarity and not the nods and winks of the project leadership; the latter group would know not to waste their time most likely helping people make proprietary software, if that is something they object to.

Indeed, when people contribute to a project it is on the basis of the social contract of the licence. When the licence is a copyleft licence, people will care whether others uphold their obligations. Some people say that they do not want the licence enforced on a project they contribute to. They have a right to express their own preference, but they cannot speak for everyone else who contributed under the explicit social contract that is the licence. Where even one person who has contributed to a project sees their code used against the terms of the licence, that person has the right to demand that the situation be remedied. Denying individuals such rights because “they didn’t contribute very much” or “the majority don’t want to enforce the licence” (or even claiming that people are “holding the project to ransom”) sets a dangerous precedent and risks making the licence unenforceable for such projects as well as leaving the licence itself as a worthless document that has nothing to say about the culture or functioning of the project.

Some people wonder, “Why do you care what people do with your code? You have given it away.” Firstly, you have not given it away: you have shared it with people with the expectation that they will continue to share it. Copyleft licensing is all about the rights of the end-user, not about letting people do what they want with your code so that the end-user gets a binary dropped in their lap with no way of knowing what it is, what it does, or having any way of enjoying the rights given to the people who made that binary. As smartphone purchasers are discovering, binary-only shipments lead to unsustainable computing where devices are made obsolete not by fundamental changes in technology or physical wear and tear but by the unavailability of the fixed, improved or maintained software that keeps such devices viable.

Agreeing on the Licence

Disregarding the incompatibility between GPL versions, as discussed above, it appears more tempting to blame the GPL for situations of GPL-incompatibility than it does to blame other licences written after GPLv2 for causing such incompatibility in the first place. The article mentions that Sun deliberately made the CDDL incompatible with the GPL, presumably because they did not want people incorporating Solaris code into the GNU or Linux projects, thus maintaining that “competitive edge”. We all know how that worked out for Solaris: it can now be considered a legacy platform like AIX, HP-UX, and IRIX. Those who like to talk up GPL incompatibilities also like to overlook the fact that GPLv3 provides additional compatibility with other licences that had not been written in a GPLv2-compatible fashion.

The article mentions MoinMoin as being affected by a need for GPLv2 compatibility amongst its dependencies. In fact, MoinMoin is licensed under the GPLv2 or any later version, so those combining MoinMoin with various Apache Software Licence 2.0 licensed dependencies could distribute the result under GPLv3 or any later version. For those projects who stipulated GPLv2 only (against better advice) or even ones who just want the choice of upgrading the licence to GPLv3 or any later version, it is claimed that projects cannot change this largely because the provenance of the code is frequently uncertain, but the Mercurial project managed to track down contributors and relicensed to GPLv2 or any later version. It is a question of having the will and the discipline to achieve this. If you do not know who wrote your project’s code, not even permissive licences will protect you from claims of tainted code, should such claims ever arise.

The Fear Factor

Contrary to popular belief, all licences require someone to do (or not do) something. When people are not willing to go along with what a licence requires, we get into the territory of licence violation, unless people are taking the dishonest route of not upholding the licence and thus potentially betraying their project’s contributors. And when people fall foul of the licence, either inadvertently or through dishonesty, people want to know what might happen next.

It is therefore interesting that the article chooses to dignify claims of a GPL “death penalty”, given that such claims are largely made by people wanting to scare off others from Free Software, as was indeed shown when there may have been money and reputations to be made by engaging in punditry on the Google versus Oracle case. Not only have the actions taken to uphold the GPL been reasonable (contrary to insinuations about “infamous” reputations), but the licence revision process actually took such concerns seriously: version 3 of the GPL offers increased confidence in what the authors of the GPL family of licences actually meant. Obviously, by shunning GPLv3 and stipulating GPLv2 “only”, recipients of code licensed in such a way do not get the benefit of such increased clarity, but it is still likely that the fact that the licence authors sought to clarify such things may indeed weigh on interpretations of GPLv2, bringing some benefit in any case.

The Scapegoat

People like to invoke outrage by mentioning Richard Stallman’s name and some of the things he has said. Unfortunately for those people, Stallman has frequently been shown to be right. Interestingly, he has been right about issues that people probably did not consider to be of serious concern at the time they were raised, so that mentions of patents in GPLv2 not only proved to be far-sighted and useful in ensuring at least a workable level of protection for Free Software developers, but they also alerted Free Software communities, motivated people to resist patent expansionism, and predicted the unfortunate situation of endless, costly litigation that society currently suffers from. Such things are presumably an example of “specific usecases that were relevant at the time the license was written” according to the article, but if licence authors ignore such things, others may choose to consider them and claim some freedom in interpreting the licence on their behalf. In any case, should things like patents and buy-to-rent business models ever become extinct, a tidying up of the licence text for those who cannot bear to be reminded of them will surely do just fine.

Certain elements in the Python community, especially, seem to have a problem with Stallman and copyleft licensing, some blaming disagreements with, and the influence of, the FSF during the Python 1.6 licensing fiasco, where the FSF rightly pointed out that references to venues (“Commonwealth of Virginia”) and “click to accept” buttons in the licence text (with implicit acceptance through usage) would cause problems. Indeed, it is all very well lamenting that the interactions of licences with local law are not well understood, but one would think that where people have experience with such matters, others might choose to listen to their opinions.

It is a misrepresentation of Stallman’s position to claim that he wants strong copyright, as the article claims: in fact, he appears to want a strengthening of the right to share; copyleft is only a strategy to achieve this in a world with increasingly stronger copyright legislation. His objections to the Swedish Pirate Party’s proposals on five-year copyright terms merely follow previous criticisms of additional instruments – in this case end-user licence agreements (EULAs) – that allow some parties to circumvent copyright restrictions on other people’s work whilst imposing additional restrictions – in previous cases, software patents – on their own and others’ works. Finding out what Stallman’s real position is might require a bit of work, but it isn’t secret, and he in fact advocates significantly reduced copyright terms, just as the Pirate Party does. If one is going to describe someone else’s position on a topic, it is best not to claim anything at all if the alternative is to just make stuff up instead.

The article ramps up the ridicule by claiming that the FSF itself claims that “cloud computing is the devil, cell phones are exclusively tracking devices”. Ridiculing those with legitimate concerns about technology and how it is used builds a culture of passive acceptance that plays into the hands of those who will exploit public apathy to do precisely what people labelled as “paranoid” or “radical” had warned everyone about. Recent events have demonstrated the dangers of such fashionable and conformist ridicule and the complacency it builds in society.

All Things to All People

Just as Richard Stallman cannot seemingly be all things to all people – being right about things like the threat of patents, for example, is just so annoying to those who cannot bring themselves to take such matters seriously – so the FSF and the GPL cannot be all things to all people, either. But then they are not claiming to be! The FSF recognises other software licences as Free Software and even recommends non-copyleft licences from time to time.

For those of us who prefer to uphold the rights of the end-user, so that they may exercise control over their computing environment and computing experience, the existence of the GPL and related copyleft licences is invaluable. Such licences may be complicated, but such complications are a product of a world in which various instruments are available to undermine the rights of the end-user. And defining a predictable framework through which such licences may be applied is one of the responsibilities that the FSF has taken upon itself to carry out.

Indeed, few other organisations have been able to offer what the FSF and closely associated organisations have provided over the years in terms of licensing and related expertise. Maybe such lists of complaints about the FSF or the GPL are a continuation of the well-established advertising tradition of attacking a well-known organisation to make another organisation or its products look good. The problem is that nobody really looks good as a result: people believe the bizarre insinuations of political propaganda and are less inclined to check what the facts say on whichever matter is being discussed.

People are more likely to make bad choices when they have only been able to make uninformed choices. The article seeks to inform people about some of the practicalities of licence compatibility but overemphasises sources with an axe to grind – and, in some cases, sources with rather dubious motivations – that are only likely to drive people away from reliable sources of information, filling the knowledge gap of the reader with innuendo from third parties instead. If the intention is to promote permissive licensing or merely licences that are shorter than the admittedly lengthy GPL, we would all be better served if those wishing to do so would stick to factual representations of both licensing practice and licence author intent.

And as for choosing a licence, some people have considered such matters before. Seeking to truly understand licences means having all the facts on the table, not just the ones one would like others to consider combined with random conjecture on the subject. I hope I have, at least, brought some of the missing facts to the table.

Ubuntu Edge: Making Things Even Harder for Open Hardware?

Wednesday, July 24th, 2013

The idea of a smartphone supportive of Free Software, using hardware that can be supported with Free Software, goes back a few years. Although the Openmoko Neo 1973 attracted much attention back in 2007, not only for its friendliness to Free Software but also for the openness of its hardware design, the Trolltech Greenphone had delivered, almost a full year before the Neo, a hardware platform that ran mostly Free Software and was ultimately supported using entirely Free Software (something that had been a matter of some earlier dispute). Unfortunately, both of these devices were discontinued fairly quickly: the Greenphone was more a vehicle to attract interest in the Qt-based Qtopia environment amongst developers, existing handset manufacturers and operators, and although the Neo 1973 was superseded by the Neo FreeRunner, the commercial partner of the endeavour eventually chose to abandon development of the platform and further products of this nature. (Openmoko now sells a product called WikiReader, an intriguing concept in itself, principally designed as an offline reader for Wikipedia.)

What survived Openmoko’s withdrawal from the pursuit of the Free Software smartphone was the community (or communities) around such work, which had taken an active interest in developing software for such devices and had seen the merits of being able to influence their design through the principles of open hardware. Some efforts were made to continue the legacy: the GTA04 project develops and offers replacement hardware for the FreeRunner (known as GTA02 within the Openmoko project) using updated and additional components; an earlier “gta02-core” effort attempted to refine the development process and specification of a successor to the FreeRunner but did not appear to produce any concrete devices; a GTA03 project, which appeared to be a more participative continuation of the previous work, inviting the wider community into the design process alongside those who had designed the previous generations of Neo devices, never really took off other than to initiate the gta02-core effort. This perhaps indicates that as the commercial sponsor’s interest started to vanish, the community was somewhat unreasonably expected to provide not only its own expertise but also that withdrawn by the sponsor, which included much of the hardware design and manufacturing expertise. Nevertheless, there is a degree of continuity from the false starts of GTA03 and gta02-core through to GTA04 and its own successes and difficulties today.

Then and Now

A lot has happened in the open hardware world since 2007. Platforms like Arduino have become very popular amongst electronics enthusiasts, encouraging the development of derivatives, clones, accessories and an entire marketplace around experimentation, prototyping and even product development. Other long-established microcontroller-based solution vendors have presumably benefited from the level of interest shown towards Arduino and other “-duino” products, too, even if those solutions do not give customers the right to copy and modify the hardware as Arduino does with its hardware licensing. Access to widely used components such as LCD panels has broadened substantially with plenty of reasonably priced products available that can be fairly easily connected to devices like the Arduino, BeagleBoard, Raspberry Pi and many others. Even once-exotic display technologies like e-paper are becoming accessible to individuals in the form of ready-to-use boards that just plug into popular experimenter platforms.

Meanwhile, more sophisticated parts of the open hardware world have seen their own communities develop in various ways. One community emerging from the Openmoko endeavour was Qi-Hardware, supported by Sharism, which acquired the rights to produce the Ben NanoNote from the vendor of an existing product, thus delivering a device with completely documented electronics, every aspect of which can be driven by Free Software. Unfortunately, efforts to iterate on the concept stalled after attempts to make improved revisions of the Ben, presumably in preparation for future versions of the NanoNote concept. Another project founded under the Qi-Hardware umbrella has been extending the notion of “copyleft hardware” to system-on-a-chip (SoC) solutions, delivering the Milkymist platform in the shape of the Milkymist One video synthesizer. Having dealt with commercially available but proprietary SoC solutions, such as the SoC used in the Ben NanoNote, some people evidently wish to break free of the dependency on silicon vendors and their often poorly documented products, taking control not only of the hardware using Free Software tools but also of how the very hardware platform itself is designed and built.

There are plenty of other hardware development initiatives taking place – OpenPandora, the EOMA-68 initiative, the Vivaldi KDE tablet (which is now to be based on EOMA-68 hardware), the Novena open laptop – many of which have gained plenty of experience, sometimes very hard-earned, in getting hardware designed and produced. Indeed, the history of the Vivaldi initiative seems to provide a good illustration of how lessons that others have already learned continue to be learned independently: after the project had negotiated manufacturing whilst suffering GPL-violating industry practices, the manufacturer changed the specification and rendered a lot of the existing work useless (presumably the part supporting the hardware with Free Software drivers).

In short, if you are considering designing a device “to run Linux”, the chances are that someone else is already doing just that. When people suggest that you look at various other projects or initiatives, they are not doing so to inflate the reputation of those projects: it is most likely the case that people associated with those projects can give you advice that will save you time and effort, even if there is no further collaboration to be had beyond exchanges of useful information.

The Competition for Attention

Ubuntu Edge – the recently announced, crowd-funded “dockable” smartphone – emerges at a time when many existing open hardware projects are already in need of funding. Those who might consider supporting such worthy efforts may be able to afford to support more than one of them, but they may find it difficult to justify doing so. Precious few details exist of the hardware featured in the Ubuntu Edge product, and it would be reasonable to suspect, given the emphasis on specifications and features, that it will not be open hardware. Moreover, given the tendency of companies entering the smartphone market to do so as conveniently as possible by adopting the “chipset of the month”, combined with the scarcity of silicon favouring true Free Software support, we might also suspect that the nature of the software support will fall short of what we should be demanding: the ability to modify and maintain the software in order to use the hardware indefinitely and independently of the vendor.

Meanwhile, other worthy projects beyond the open hardware realm compete for the money of potential sponsors and donors. The Fairphone initiative has also invited people to pledge money towards the delivery of devices, although in a more tangible fashion than Ubuntu Edge, with genuine plans having been made for raw materials sourcing and device manufacture, and with software development supposedly undertaken on behalf of the project. As I noted previously, there are some unfortunate shortcomings with the Fairphone initiative around the openness of the software, and unless the participants are able to change the mindset of the chipset vendor and the suppliers of the various technologies incorporated into the chipset, sustainable Free Software support may end up depending on reverse-engineering efforts. Mozilla’s Firefox OS, meanwhile, certainly emphasises a Free Software stack along with free and open standards, but the status of the software support for certain hardware functions is likely to depend on the details of the actual devices themselves.

Interest in open phones is not new, nor even is interest in “dockable” smartphones, and there are plenty of efforts to build elements of both while upholding Free Software support and even the principles of open hardware. Meanwhile, the Ubuntu Edge campaign provides no specifics about the details of the hardware; it is thus unable to make any commitment about Free Software drivers or binary firmware “blobs”. Maybe the intention is to one day provide things like board layouts and case designs as resources for further use and refinement by the open hardware community, but the recent track record of Canonical and Ubuntu with secretive and divisive – or at least not particularly transparent or cooperative – product development suggests that this may be too much to hope for.

Giving the Gift

$32 million is a lot of money. Broken into $600 chunks, each with the reward of the advertised device, or a consolation prize of your money back minus a few percent in fees and charges should the fund-raising campaign fail to reach its target, it is a lot of money for an individual, too. (There is also the worst-case eventuality that the target is met but the product is not delivered, at which point everybody will find that they have merely donated towards a nice but ultimately unrealisable or undeliverable idea.) One could do quite a bit of good work with even small multiples of $600, and with around 0.5% of the Ubuntu Edge campaign target, one could fund something like the GCW Zero. That might not aggressively push back the limits of mobile technology on every front, but it gives people something different and valuable to them, while still leaving plenty of money floating around looking for a good cause.

But it is not merely about the money, even though many of those putting down money for the Ubuntu Edge are likely to have ruled out doing the same for the Fairphone (and perhaps some of those who have ordered their Fairphone regret placing their order now that the Ubuntu Edge has made its appearance), purely because they neither need nor can reasonably afford or justify buying two new smartphones for delivery at some point in the future. The other gift that could be given is collaboration and assistance to the many projects already out there toiling to put Linux on some SoC or other, developing an open hardware design for others to use and improve, and deepening community expertise that might make these challenges more tolerable in the future.

Who knows how the Ubuntu Edge will be developed, whether or not the funding target is reached? But imagine what it would be like if such generosity could be directed towards existing work; if existing and new projects were able to work more closely with each other; if the expertise in different projects could be brought in to make some new endeavour more likely to succeed and less fraught with problems; if communities were included, encouraged to participate, and encouraged to take their own work further, enriching their own project and improving any future collaborations.

Investing, not Purchasing

$32 million is a lot of money. Less exciting things (to the average gadget buyer) like the OpenRISC funding drive to produce an ASIC version of an open hardware SoC wanted only $250,000 – still a lot of money, but less than 1% of the Ubuntu Edge campaign target – and despite the potential benefits for both individuals and businesses, that drive still fell far short of its mark. Yet if such projects were funded, they might open up opportunities that do not exist now and would probably still not exist even if Ubuntu got its product funded. And there are plenty of other examples where donations are more like investments in a sustainable future than one-off purchases of nice-looking gadgets.
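
As a quick sanity check on the figures quoted here and in the previous section, the back-of-envelope arithmetic is easy to reproduce. Below is a minimal sketch in Python; the campaign amounts are those quoted in the text, whereas the GCW Zero goal is only my rough recollection and should be treated as an assumption:

    # Figures quoted in the text.
    ubuntu_edge_target = 32000000   # Ubuntu Edge campaign target ($)
    pledge = 600                    # pledge securing one advertised device ($)
    openrisc_target = 250000        # OpenRISC ASIC funding drive target ($)

    # Assumed figure: the GCW Zero campaign goal (approximate, from memory).
    gcw_zero_goal = 130000

    # How many full-price pledges the Ubuntu Edge campaign needs.
    print("Pledges needed: %d" % (ubuntu_edge_target // pledge))

    # Around 0.5% of the target would cover something like the GCW Zero.
    print("0.5%% of target: $%d (assumed GCW Zero goal: $%d)"
          % (ubuntu_edge_target * 0.005, gcw_zero_goal))

    # The OpenRISC drive wanted less than 1% of the Ubuntu Edge target.
    print("OpenRISC share of target: %.2f%%"
          % (100.0 * openrisc_target / ubuntu_edge_target))

Running this gives roughly 53,000 pledges for the device itself, $160,000 as the 0.5% figure, and about 0.78% for the OpenRISC share, consistent with the “less than 1%” claim above.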

Those thinking about making a Free Software phone might want to check in with the GTA04 project to see if there is anything they can learn or help out with. Similarly, that project could perhaps benefit from evaluating the EOMA-68 initiative, which in turn could consider supporting genuinely open SoCs (and could also remove the uncertainty about patent assertion for participants in the initiative by providing transparent governance mechanisms rather than relying on the transient goodwill of the current custodians). As expertise is shared and collaboration increases, the money might start to be spread around a bit more as well, and cash-starved projects might be able to do things before those things become less interesting or even irrelevant because the market has moved on.

We have to invest both financially and collaboratively in the good work already taking place. Not to do so means that opportunities almost within our grasp go unseized, and that the people who have worked hard to offer us those opportunities are let down. We might lose the valuable expertise of such people through sheer disillusionment, and yet the casual observer might still wonder when we will see the first fully open, Free Software friendly, mass-market-ready smartphone, thinking it simply beyond “the community” to deliver. In fact, we may be letting the opportunity to deliver such things pass us by more often than we realise, purely out of ignorance of the community’s ongoing endeavours.

Diversions and Distractions

Ubuntu Edge sounds exciting. It is just a shame that it does not appear to enable and encourage everyone who has already been working to realise such ambitions on substantially lower budgets, and without the brand reputation needed to cultivate the interest of the technology media and enthusiastic consumers. Millions of dollars in committed funds, and an audience preferring the passive position of expectant customers to that of active contributors to existing efforts, add up to a diversion of participation and resources away from open hardware projects.

Such blockbuster campaigns may even distract from open hardware projects because, for those who might need only slight persuasion to get involved, the appearance of an easy solution demanding nothing more than some spare cash and no intellectual investment may be discouragement enough, affirming that, as with so many other matters, somebody else has it covered. Consequently, such people retreat from what might have been a rewarding pursuit, one that would have deepened their understanding of technology and the issues around it.

Not everyone has the time or inclination to get involved with open hardware, of course, especially if they are starting with practically no knowledge of the field. But with so many people and their green pieces of paper parked and waiting for the Ubuntu Edge, it is certainly possible that the campaign will make it even harder for the open hardware movement to get the recognition and traction it deserves.