## OpenStreetMap considers new licence

The OSM board have just sent notice that they’ve set the end of 2008 as their deadline to produce the new licence for OSM. The current draft being discussed, which you might like to take a look at, is that of April 2008:

And FWIW, the board meeting minutes have been put online too.

## Why European software patents are legally invalid

The European Patent Convention defines, in general terms, which ideas are patentable and which are not. The pertinent part is Article 52, which says:

"Patentable inventions

(1) European patents shall be granted for any inventions, in all fields of technology, provided that they are new, involve an inventive step and are susceptible of industrial application.

(2) The following in particular shall not be regarded as inventions within the meaning of paragraph 1:

(a) discoveries, scientific theories and mathematical methods;

(b) aesthetic creations;

(c) schemes, rules and methods for performing mental acts, playing games or doing business, and programs for computers;

(d) presentations of information.

(3) Paragraph 2 shall exclude the patentability of the subject-matter or activities referred to therein only to the extent to which a European patent application or European patent relates to such subject-matter or activities as such."

So "programs for computers / shall not be regarded as inventions / as such".

In the 1990s, the European Patent Office created a bizarre interpretation whereby "as such" is a reference to "as programs for computers", and thus the exclusion can be completely ignored if the patent application uses a name other than "programs for computers" for the claimed idea. So if I have an idea related to a program for a computer, and I want to patent that idea *as a computer implemented invention* then that’s no problem. The exclusion is thus a mere formality with no substance, according to the EPO.

Which invites the question: if the drafters intended the exclusion to be meaningless, why did they bother adding it? Of course, the EPO’s interpretation isn’t at all what was intended.

A second obvious problem with the EPO’s interpretation is that it doesn’t just render meaningless the exclusion of computer programs. It renders all the exclusions meaningless, so games, doing business, scientific theories, "rules and methods for performing mental acts" (yes, ways of using your brain), and all the other things listed in Paragraph 2 of Article 52 should be patentable. Which is completely absurd.

Unfortunately, a UK appeal court has recently upheld this bizarre twisting of patent law – and that article mis-reports the patent dangers as "protection" for software developers.

## EU states to discuss Internet filtering

The French government is likely to lobby the other EU member states to support disconnecting people from the Internet without a court case. The French government first tried to convince the European Parliament (EP), but that backfired and the EP adopted a text (amendments 138, 166) stating that a judicial process should always be necessary (Sept 24th). Then Sarkozy sent a letter (Oct 3rd, page 2 paragraph 1) to the European Commission asking them to reject the EP’s amendment, but the Commission has rejected Sarkozy’s request (Oct 7th).

So the remaining option is for Sarkozy to convince the other EU member states to oppose the European Parliament’s amendments. The EU member states form the European Council, and they have the power during the current stage of the EU legislative process. So letters will have to be sent to the relevant minister in each national government regarding this issue.

This isn’t a direct threat to free software, but Sarkozy’s proposal is to give control over Internet connections to the music industry. Internet connections are important for free software users and developers, and the music industry is practically always our opponent on legislative issues.

## New monthly feature: Fellowship interviews

We’ve started a series of monthly Fellowship interviews, as many probably noticed (thanks to LWN, FSDaily, GNUvox, Linux.com, and Groklaw).

There’s an RSS feed and a permanent URL:

Any Fellow of FSFE can be nominated to be interviewed. In fact, we need nominations: we don’t know every Fellow, so to find good candidates, we need suggestions from you. Let me know, or send suggestions to fellowship [a] fsfeurope dot org.

## 3 articles with RMS

There’s a new essay on gnu.org about Avoiding Ruinous Compromises. I guess the main point is that if we want the freedoms of free software to eventually come as standard, we need people to take freedom into consideration when choosing software. Or as Stallman puts it, we need to change people’s mindset, rather than looking for short-term gains by appealing to people’s existing mindset.

The second is an interview looking back at 25 years of GNU. He’s asked at the end if he’s discouraged by the rate of progress of the free software movement, but he replies: "It’s a strange thing, but at least in the area of free software, we’re making progress, whereas in all other areas of human rights, the world is getting worse".

And there’s one on guardian.co.uk, denouncing cloud computing. This generated quite a discussion on reddit.com. (Thanks to Matthias for highlighting this one.)

Update 2008-10-06: RMS’s cloud comments have made quite a stir, getting on Linux Journal, Slashdot, Computerworld, and many others.

## What organisations not to join

(Post publication note: I hoped to come back and finish this, but haven’t found time yet. As a software freedom lobbyist in Brussels, I’m worried by the prospect of anti-free-software corporations being able to claim to represent the free software community. Funding software development is fine, but we need to make clear that, politically, these corporations don’t represent us. I’ll try to explain my point further in the coming weeks. 2008-10-06)

Some organisations need your support. Others don’t. Most people can guess which organisation I recommend supporting, but this blog entry is about how I rule out the organisations that don’t need my support.

Take an example organisation whose annual budget is about $5 million, and who gets almost 80% of its funding from these companies: Fujitsu, HP, Hitachi, IBM, Intel, NEC, Novell, and Oracle. This looks like a Who’s Who of pro-software patent campaigners – they’re only missing Microsoft. So this consortium is working for companies that lobbied against FSFE, FFII, and all the SMEs, national organisations, and individuals who gave their time and money to keep Europe free from software patents. (I know the example organisation has some projects related to software patents, but they’re not what anti-swpat organisations have asked for.) It’s also clear that almost all of them earn more money from proprietary software than they do from free software. It’s safe to say that individual $50 members will never have any financial say in this consortium’s work.

So who makes the decisions? The board: Larry Augustin, Alan Clark (Novell), Wim Coekaerts (Oracle), Masahiro Date (Fujitsu), Frank Fanzilli, Doug Fisher (Intel), Dan Frye (IBM), Hisashi Hashimoto (Hitachi), Randy Hergett (HP), Brian Pawlowski (NetApp), Chris Schlaeger (AMD), Tsugikazu Shibata (NEC), Mark Shuttleworth, Eric Thomas (Texas Instruments), Christy Wyatt (Motorola).

When I see this, I ask myself if there’s anything that is useful for free software that could get majority support among them. They do fund useful software development, but the market value of free software has already led corporations who don’t share our values to employ thousands of free software developers.

Organisations with massive corporate funding, with motivations contrary to those of the GNU/Linux using community anyway, should be content with their millions and should not be asking individuals to dig into their own pockets.

## Japanese PDFs part 2: XeTeX

(Last month’s article: Using LaTeX to make PDF documents with Japanese characters)

I’ve found a better TeX tool for making Japanese PDFs: XeTeX. Below are first the technical advantages, and then an analysis of community and sustainability.

XeTeX is a version of TeX that has been modified to use Unicode (UTF-8) encoding internally. It is also configured to work with modern font tools such as FreeType and fontconfig. With XeTeX, the minimal example from my last article becomes:

```latex
\documentclass[12pt]{article}
\usepackage{fontspec}
\setmainfont{Sazanami Mincho}
\begin{document}
\section{What I learned today}
I can write this 私はキランです in Japanese.
\end{document}
```

This is converted to a PDF with the command line tool xelatex. XeTeX has been part of the very common TeX Live bundle since TeX-Live-2007. So if LaTeX is available for your GNU/Linux distro, I’m sure TeX Live is too, and thus XeTeX. (TeX-Live-2008 will be released soon.)

(For a more complex example, see jlesson002.tex, and the output jlesson002.pdf.)

One improvement in this example is that I wrote the file in the very common UTF-8 encoding. This means I don’t have to tell my applications to use the EUC-JP format that LaTeX+CJK would have required, and it means I’m less likely to have compatibility problems with other text processing tools. (This article was actually supposed to be about converting Japanese TeX to plain text, but an application’s lack of support for EUC-JP encoding led me to research UTF-8 versions of TeX.)

A second improvement is that I could use the standard "article" document class. When using CJK, you can only use document classes that have been specifically written to work with CJK. There is a CJK-enabled equivalent for "article", called "scrartcl", but for some other classes, there’s no equivalent that works with CJK.

Another improvement is that the font is specified in a much more readable way ("Sazanami Mincho"), and if I want to use another font, I can use this fontconfig command at the shell to find all fonts on my system that include Japanese characters:

```shell
fc-list :lang=ja
```

On my system, this finds six fonts. The differences between Gothic and Mincho are roughly equivalent to sans-serif and serif fonts in Western scripts.

It’s hard to find a list of free Japanese fonts. It seems that many Japanese font developers have invented their own licences. Two free fonts available are Kochi and Sazanami, of which some say the latter is slightly better, but I can’t see any difference. There is also a font called "UmePlus", which seems to be free, but is missing from some distributions (such as Debian) because the licence is somewhat unclear (but it looks fine to me). When I say "free", I mean it in the free software sense, i.e. that everyone can use, copy, modify, and redistribute it (modified or unmodified).

Note: I set the default font to a Japanese font because my documents are wholly/mostly in Japanese. If you just wanted to add some Japanese to a mostly English document, XeTeX is still a good option, but I won’t go into how to do that (it involves defining a Japanese environment and beginning the environment, entering Japanese, then ending the environment).
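To give a rough idea of what that mixed-language approach involves, here is a minimal sketch. The `japanese` environment name and the Sazanami Mincho font are my own example choices, not a standard; this uses fontspec’s `\newfontfamily` command to declare a second font family alongside the Latin main font:

```latex
\documentclass[12pt]{article}
\usepackage{fontspec}
% The main font stays Latin; declare a separate family for Japanese.
% "Sazanami Mincho" is just an example -- pick any font that
% "fc-list :lang=ja" reports as installed on your system.
\newfontfamily\japanesefont{Sazanami Mincho}
% A small environment that switches to the Japanese family.
\newenvironment{japanese}{\japanesefont}{}

\begin{document}
This document is mostly in English.
\begin{japanese}
ここだけ日本語で書いています。
\end{japanese}
And back to English again.
\end{document}
```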

A last, minor technical improvement is output file size. For a one-line test file, pdflatex made a file of 19.6kb, and xelatex made one of only 7.5kb. For a more complex 1-page file (jlesson002.tex), the XeTeX output was 15.1kb, and when I converted it to LaTeX-CJK, pdflatex made a file of 65.2kb.

What about community support and sustainability?

Is it safe to move from the old reliable LaTeX+CJK package to this new XeTeX thing? Will XeTeX still have a developer community in the future? Will developers of other TeX tools take care to ensure their packages work with XeTeX? What do Japanese TeX users use?

My searches suggest that Japanese TeX users are using a mix of tools. Some use pTeX, which is a version of TeX modified specifically to work with Japanese. Others use LaTeX+CJK. But there seems to be consensus that these are tools of the past and that Unicode is the future. So change is coming.

Japan’s top TeX expert, Haruhiko Okumura, said in April 2007: "Since pTeX for Unicode is now being developed and XeTeX is acquiring pTeX-like versatility, next year I’ll be using either the new pTeX or XeTeX."

The pTeX for Unicode project he’s referring to is uptex. It exists, but seems to be still in alpha (early testing) stage. It isn’t available in the Debian archives, but someone has made Debian uptex packages. (I haven’t tested them.)

If Mr. Okumura has now adopted upTeX or XeTeX, I bet he chose XeTeX.

Next, I got really scientific. I put a few combinations of words into search engines, each time including "2008", a Japanese word, and either "uptex" or "xetex". Each time, XeTeX won by miles. So I guess Japanese people are not currently using uptex. I think XeTeX is winning the battle for Unicode TeX in Japan.

XeTeX being accepted into the TeX Live bundle is also a strong endorsement that XeTeX’s future is safe, and the maintainer of LaTeX-CJK is discussing whether it and XeTeX can be merged.

The only bad sign I saw about XeTeX is that the maintainer has recently resigned from his job, but he says this shouldn’t affect his ability to maintain XeTeX.

Ok, so that’s this month’s TeX wisdom from a newbie. Hopefully next month’s article will be about generating plain text files from the same Japanese TeX source files used for generating PDFs. Final note: I’m pretty sure all these tips work for Chinese, Korean, and other non-Latin scripts, but I haven’t tried that yet.