Brian Gough’s Notes


Archive for the ‘software’ Category

Some pre-release checks for GNU packages

Monday, April 4th, 2011

Here’s a list of pre-release checks I’ve found useful over the years. Most of them should apply to any package, but they’ve been used for a C library (“libfoo” in the examples below, with exported header files foo/*.h and source files src/*.c).

  • Make sure all internal functions are static, and that all exported functions and variables are prefixed with foo_ or FOO_. Here’s a command for checking this:
    nm -A -g -P *.a  | perl -a -n -e 'print if $F[1] !~ /foo_/ && $F[2] ne "U"'
  • Make sure config.h is used consistently – it should be present in source files, but not in any exported headers.
    grep config.h foo/*.h      # should not match anything
    grep -L config.h src/*.c   # lists files not using config.h
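
    As a sketch, both checks can be wired into one small maintainer script that fails loudly if config.h leaks into an exported header (foo/ and src/ are the layout assumed above):

    #!/bin/sh
    # fail if config.h is included in any exported header
    if grep -l config.h foo/*.h; then
        echo "error: config.h included in an exported header" >&2
        exit 1
    fi
    # list any source files that do not include config.h
    grep -L config.h src/*.c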
  • Make a master header file of all exported headers and try compiling it to test for conflicts between different header files.
    cat foo/*.h > tmp.c
    gcc -c tmp.c   # compile only, since there is no main()
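
    A variant of the same test, as a sketch, generates #include lines instead of concatenating the headers, so each header is checked the way a user would include it (gcc’s -fsyntax-only skips code generation and linking):

    for h in foo/*.h; do echo "#include \"$h\""; done > tmp.c
    gcc -fsyntax-only tmp.c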
  • Check that the library passes “make check” with various options. Some good ones are:
    -O3
    -mfpmath=sse -msse2
    -funsigned-char (to simulate Power/ARM)
    -std=c89 -ansi -pedantic (to catch accidental reliance on C99 or GNU extensions, since GCC’s default mode is more permissive than strict C89)
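
    One way to run the suite with each set of flags, assuming an Autoconf/Automake build where CFLAGS can be overridden on the make command line (a sketch, not the only way to do it):

    make clean && make check CFLAGS="-O3"
    make clean && make check CFLAGS="-mfpmath=sse -msse2"
    make clean && make check CFLAGS="-funsigned-char"
    make clean && make check CFLAGS="-std=c89 -pedantic"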
  • Test with valgrind. You can run all the tests under valgrind with the TESTS_ENVIRONMENT variable
     $ TESTS_ENVIRONMENT="valgrind --tool=memcheck" make check

    This only works if configure was run with --disable-shared, because otherwise the test programs in the build tree are libtool wrapper scripts, and it is those shell scripts rather than the real binaries that get run under valgrind.

    Regarding memory leaks, the GNU coding standards say “Memory leak detectors such as valgrind can be useful, but don’t complicate a program merely to avoid their false alarms. For example, if memory is used until just before a process exits, don’t free it simply to silence a leak detector.”

    For libraries I’ve found it very helpful (essential, in fact) to have a test suite that is leak-free. Then if any test program exits with memory still allocated, it indicates a problem. This is how I have caught memory leaks during development.
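
    As a sketch, valgrind can also be asked to make the tests fail automatically when it finds errors or definite leaks, so “make check” itself reports the problem (the extra options are standard valgrind flags, nothing package-specific):

     $ TESTS_ENVIRONMENT="valgrind --tool=memcheck --leak-check=full --error-exitcode=1" make check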

  • Run make distcheck with the local copy of install-sh provided by automake, in addition to /usr/bin/install. Depending on the end user’s environment, either of these could end up being used.
    make distcheck # will use /usr/bin/install
    make distcheck INSTALL=$HOME/foo/install-sh  # absolute location of install-sh

    When you run “make install” yourself it will probably use /usr/bin/install, since that is what is present on GNU/Linux systems. If so, you would never detect a problem with the install-sh shipped in your package, for example if it had become corrupted by a stray edit. In my case, after a repository conversion my working directory somehow ended up containing the first version of install-sh ever checked in, dating from 1994. It did not work properly on modern systems, but I never noticed because I was always using /usr/bin/install.

  • Try running the build and “make check” with a memory limit, to be sure that building and testing the package do not require excessive memory.
    ulimit -v 65536  # a reasonable memory limit (value in kilobytes, i.e. 64 MB)
    make
    make check
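
    Running the limited build in a subshell keeps the reduced limit from affecting the rest of the interactive session (a minor convenience, not a requirement):

    ( ulimit -v 65536 && make && make check )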
  • Check that the Texinfo manual builds correctly as PostScript/PDF:
    make dvi  # or make pdf

    It’s easy to miss a diagram or other TeX input file that is needed by texi2dvi but not by makeinfo.
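
    One way to catch such missing files, as a sketch, is to build the manual from the unpacked distribution tarball rather than from the working directory (libfoo-1.0.tar.gz is a hypothetical name):

    tar xzf libfoo-1.0.tar.gz
    cd libfoo-1.0 && ./configure && make dvi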

GNU Parallel – a map operator for the command line

Thursday, January 13th, 2011

GNU Parallel is a nice recent addition to my Unix toolkit. It can do a lot of things; my favorite use is as a “map” operator for the command line, automatically parallelising a command over its arguments.

The command

parallel command {} ::: arg1 arg2 arg3 ...

is equivalent to

 command arg1 &
 command arg2 &
 command arg3 &
 ...

run in parallel. The {} is replaced by each argument in turn.
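
For example (a hypothetical case), compressing a set of log files in parallel:

parallel gzip {} ::: *.log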

As part of the regular updates to GSRC I need to run “make check-update” in several hundred directories, one for each GNU package. With a for loop that takes a long time, as “check-update” examines remote URLs to see whether a new version of a package has been released. With parallel I can use

parallel make -C {} check-update ::: gnu/*/

to get the same effect, but in parallel (by default 9 processes run simultaneously; to change this use -j N).
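
For example, to limit the run to 4 simultaneous jobs (an arbitrary choice):

parallel -j 4 make -C {} check-update ::: gnu/*/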

The author of GNU Parallel, Ole Tange, is giving a talk at FOSDEM (Brussels) in the GNU Dev room on Saturday 5 Feb 2011, see http://www.gnu.org/ghm/2011/fosdem/ for details. Hope to see you there!

2048-bit GPG Smartcards and Package Signing

Thursday, April 15th, 2010

I received a new 2048-bit RSA version 2 GPG smartcard today (ordered from Kernel Concepts). Previously I was using the older version 1.0 and 1.1 smartcards, with 1024-bit keys.

I’ve been signing software releases with a GPG smartcard for several years now (before that, with a key stored on disk) and have been migrating my systems over to smartcards for keysigning and SSH. The ultimate goal is not to have any keys stored on disk on any network-accessible machine. I also verify the signatures of sources that I download as far as possible, through the web of trust. Initially this was quite restrictive, but after a few years of making an effort to keysign at conferences I’m able to check most packages.
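
Verifying a downloaded release is a single command; as a sketch (foo-1.0 is a hypothetical package, and the .sig file is the detached signature published next to the tarball):

gpg --verify foo-1.0.tar.gz.sig foo-1.0.tar.gz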

During the keysigning session at the FSF’s LibrePlanet conference last month in Boston, Bradley Kuhn mentioned that he had actually built a basic working GNU/Linux system from scratch for crypto purposes, verifying all of the package signatures through people he had keysigned with — quite an achievement. I am inspired to follow in his footsteps and only use verified source-code.

Unfortunately, as far as I can tell (and I’m ready to be corrected here), neither GNOME nor KDE signs its source releases, which concerns me. Most other projects have been signing their releases for years, so this is an anomaly I find hard to understand.

My personal motivation for better security dates back to 2003, when it was discovered that someone (or some group) had cracked the ftp.gnu.org server and had root access for over 3 months without being detected. As a result every maintainer had to do a complete audit of all files on the server, which was an extremely time-consuming process. This incident led to the requirement that all source packages on ftp.gnu.org be GPG-signed by the developer.

Version 2 GPG Smartcard:

[Images: front and back of the version 2 GPG smartcard]

A year 2010 problem

Friday, January 15th, 2010

If you are using SpamAssassin and seem to be getting less mail since the start of the year, take note of SpamAssassin bug 6269, which describes how all mail sent in the year 2010 gets an extra 3.2 spam points simply because the “date is grossly in the future” according to a rule from 2006 with a hard-coded date.
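
Until a fixed rules update reaches your system, one possible local workaround (assuming the affected rule is FH_DATE_PAST_20XX; the path to local.cf may differ by distribution) is to zero the rule’s score in the site configuration:

echo "score FH_DATE_PAST_20XX 0" >> /etc/mail/spamassassin/local.cf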

ASP2PHP – a useful tool

Tuesday, May 31st, 2005

Needing to migrate a client’s website from Windows to GNU/Linux, I recently discovered Michael Kohn’s ASP2PHP for converting Microsoft Active Server Pages to PHP.

The site was fairly straightforward, using ASP for menus and contact forms, and the converter worked first time. The software was only at version 0.76 and had a few rough edges, so I was impressed that it worked so smoothly. Up to that point it had seemed like the only game in town was the proprietary Chilisoft, so this was a good discovery.

The project homepage is http://asp2php.naken.cc/.