Paul Boddie's Free Software-related blog

Archive for the ‘hardware’ Category

EOMA68: The Campaign (and some remarks about recurring criticisms)

Thursday, August 18th, 2016

I have previously written about the EOMA68 initiative and its objective of making small, modular computing cards that conform to a well-defined standard which can be plugged into certain kinds of device – a laptop or desktop computer, or maybe even a tablet or smartphone – providing a way of supplying such devices with the computing power they all need. This would also offer a convenient way of taking your computing environment with you, using it in the kind of device that makes most sense at the time you need to use it, since the computer card is the actual computer and all you are doing is putting it in a different box: switch off, unplug the card, plug it into something else, switch that on, and your “computer” has effectively taken on a different form.

(This “take your desktop with you” by actually taking your computer with you is fundamentally different to various dubious “cloud synchronisation” services that would claim to offer something similar: “now you can synchronise your tablet with your PC!”, or whatever. Such services tend to operate rather imperfectly – storing your files on some remote site – and, of course, expose you to surveillance and convenience issues.)

Well, a crowd-funding campaign has since been launched to fund a number of EOMA68-related products, with an opportunity for those interested to acquire the first round of computer cards and compatible devices, those devices being a “micro-desktop” that offers a simple “mini PC” solution, together with a somewhat radically designed and produced laptop (or netbook, perhaps) that emphasises accessible construction methods (home 3D printing) and alternative material usage (“eco-friendly plywood”). In the interests of transparency, I will admit that I have pledged for a card and the micro-desktop, albeit via my brother for various personal reasons that also delayed me from actually writing about this here before now.

An EOMA68 computer card in a wallet (courtesy Rhombus Tech/Crowd Supply)

Of course, EOMA68 is about more than just conveniently taking your computer with you because it is now small enough to fit in a wallet. Even if you do not intend to regularly move your computer card from device to device, it emphasises various sustainability issues such as power consumption (deliberately kept low), long-term support and matters of freedom (the selection of CPUs that completely support Free Software and do not introduce surveillance backdoors), and device longevity (when the user wants to upgrade, the card may easily be put to use in something else that might benefit from it).

This is not modularity to prove some irrelevant hypothesis. It is modularity that delivers concrete benefits to users (that they aren’t forced to keep replacing products engineered for obsolescence), to designers and manufacturers (that they can rely on the standard to provide computing functionality and just focus on their own speciality to differentiate their product in more interesting ways), and to society and the environment (by reducing needless consumption and waste caused by the upgrade treadmill promoted by the technology industries over the last few decades).

One might think that such benefits would be received with enthusiasm. Sadly, it says a lot about today’s “needy consumer” culture that instead of welcoming another choice, some would rather spend their time criticising it, often to the point that one might wonder about their motivations for doing so. Below, I present some common criticisms and some of my own remarks.

(If you don’t want to read about “first world” objections – typically about “new” and “fast” – and are already satisfied by the decisions made regarding more understandable concerns – typically involving corporate behaviour and licensing – just skip to the last section.)

“The A20 is so old and slow! What’s the point?”

The Allwinner A20 has been around for a while. Indeed, its predecessor – the A10 – was the basis of initial iterations of the computer card several years ago. Now, the amount of engineering needed to upgrade the prototypes previously made around the A10 so that they use the A20 instead is minimal, at least in comparison to adopting another CPU (which would probably require a redesign of the circuit board for the card). And hardware prototyping is expensive, especially when unnecessary design changes have to be made, when they don’t always work out as expected, and when extra rounds of prototypes are then required to get the job done. For an initiative with a limited budget, the A20 makes a lot of sense because it means changing as little as possible, benefiting from the functionality upgrade and keeping the risks low.

Obviously, there are faster processors available now, but as the processor selection criteria illustrate, if you cannot support them properly with Free Software and must rely on binary blobs which potentially violate the GPL, it would be better to stick to a more sustainable choice (because that is what adherence to Free Software is largely about) even if that means accepting reduced performance. In any case, at some point, other cards with different processors will come along and offer faster performance. Alternatively, someone will make a dual-slot product that takes two cards (or even a multi-slot product that provides a kind of mini-cluster), and then with software that is hopefully better-equipped for concurrency, there will be ways of improving the performance other than finding faster processors and hoping that they meet all the practical and ethical criteria.

“The RasPi 3…”

Lots of people love the Raspberry Pi, it would seem. The original models delivered a cheap, adequate desktop computer for a sum that was competitive even with some microcontroller-based single-board computers that are aimed at electronics projects and not desktop computing, although people probably overlook rivals like the BeagleBoard and its variants, which would have occupied a similar price point even if the Raspberry Pi had never existed. Indeed, the BeagleBone Black resides in the same pricing territory now, as do many other products. It is interesting that both product families are backed by certain semiconductor manufacturers, and the Raspberry Pi appears to benefit from privileged access to Broadcom products and employees that is denied to others seeking to make solutions using the same SoC (system on a chip).

Now, the first Raspberry Pi models were not entirely at the performance level of contemporary desktop solutions, especially by having only 256MB or 512MB RAM, meaning that any desktop experience had to be optimised for the device. Furthermore, they employed an ARM architecture variant that was not fully supported by mainstream GNU/Linux distributions, in particular the one favoured by the initiative: Debian. So a variant of Debian has been concocted to support the devices – Raspbian – and despite the Raspberry Pi 2 being the first device in the series to employ an architecture variant that is fully supported by Debian, Raspbian is still recommended for it and its successor.

Anyway, the Raspberry Pi 3 having 1GB RAM and being several times faster than the earliest models might be more competitive with today’s desktop solutions, at least for modestly-priced products, and perhaps it is faster than products using the A20. But just like the fascination with MHz and GHz until Intel found that it couldn’t rely on routinely turning up the clock speed on its CPUs, or everybody emphasising the number of megapixels their digital camera had until they discovered image noise, such number games ignore other factors: the closed source hardware of the Raspberry Pi boards, the opaque architecture of the Broadcom SoCs with a closed source operating system running on the GPU (graphics processing unit) that has control over the ARM CPU running the user’s programs, the impracticality of repurposing the device for things like laptops (despite people attempting to repurpose it for such things, anyway), and the organisation behind the device seemingly being happy to promote a variety of unethical proprietary software from a variety of unethical vendors who clearly want a piece of the action.

And finally, with all the fuss about how much faster the opaque Broadcom product is than the A20, the Raspberry Pi 3 has half the RAM of the EOMA68-A20 computer card. For certain applications, more RAM is going to be much more helpful than more cores or “64-bit!”, which makes us wonder why the Raspberry Pi 3 doesn’t support 4GB RAM or more. (Indeed, the current trend of 64-bit ARM products offering memory quantities addressable by 32-bit CPUs seems to have missed the motivation for x86 finally going 64-bit back in the early 21st century, which was largely about efficiently supporting the increasingly necessary amounts of RAM required for certain computing tasks, with Intel’s name for x86-64 actually being at one time “Extended Memory 64 Technology”. Even the DEC Alpha, back in the 1990s, which could be regarded as heralding the 64-bit age in mainstream computing, and which arguably relied on the increased performance provided by a 64-bit architecture for its success, still supported 64-bit quantities of memory in delivered products when memory was obviously a lot more expensive than it is now.)

“But the RasPi Zero!”

Sure, who can argue with a $5 (or £4, or whatever) computer with 512MB RAM and a 1GHz CPU that might even be a usable size and shape for some level of repurposing for the kinds of things that EOMA68 aims at: putting a general purpose computer into a wide range of devices? Except that the Raspberry Pi Zero has had persistent availability issues, even ignoring the free give-away with a magazine that had people scuffling in newsagents to buy up all the available copies so they could resell them online at several times the retail price. And it could be perceived as yet another inventory-dumping exercise by Broadcom, given that it uses the same SoC as the original Raspberry Pi.

Arguably, the Raspberry Pi Zero is a more ambiguous follow-on from the Raspberry Pi Compute Module that obviously was (and maybe still is) intended for building into other products. Some people may wonder why the Compute Module wasn’t the same success as the earlier products in the Raspberry Pi line-up. Maybe its lack of success was because any organisation thinking of putting the Compute Module (or, these days, the Pi Zero) in a product to sell to other people is relying on a single vendor. And with that vendor itself relying on a single vendor with whom it currently has a special relationship, a chain of single vendor reliance is formed.

Any organisation wanting to build one of these boards into their product now has to have rather a lot of confidence that the chain will never weaken or break and that at no point will either of those vendors decide that they would rather like to compete in that particular market themselves and exploit their obvious dominance in doing so. And they have to be sure that the Raspberry Pi Foundation doesn’t suddenly decide to get out of the hardware business altogether and pursue those educational objectives that they once emphasised so much instead, or that the Foundation and its manufacturing partners don’t decide for some reason to cease doing business, perhaps selectively, with people building products around their boards.

“Allwinner are GPL violators and will never get my money!”

Sadly, Allwinner have repeatedly delivered GPL-licensed software without providing the corresponding source code, and this practice may even persist to this day. One response to this has referred to the internal politics and organisation of Allwinner and that some factions try to do the right thing while others act in an unenlightened, licence-violating fashion.

Let it be known that I am no fan of the argument that there are lots of departments in companies and that just because some do some bad things doesn’t mean that you should punish the whole company. To this day, Sony does not get my business because of the unsatisfactorily-resolved rootkit scandal and I am hardly alone in taking this position. (It gets brought up regularly on a photography site I tend to visit where tensions often run high between Sony fanatics and those who use cameras from other manufacturers, but to be fair, Sony also has other ways of irritating its customers.) And while people like to claim that Microsoft has changed and is nice to Free Software, even to the point where people refusing to accept this assertion get criticised, it is pretty difficult to accept claims of change and improvement when the company pulls in significant sums from shaking down device manufacturers using dubious patent claims on Android and Linux: systems it contributed nothing to. And no, nobody will have been reading any patents to figure out how to implement parts of Android or Linux, let alone any belonging to some company or other that Microsoft may have “vacuumed up” in an acquisition spree.

So, should the argument be discarded here as well? Even though I am not too happy about Allwinner’s behaviour, there is the consideration that, as the saying goes, “beggars cannot be choosers”. When very few CPUs exist that meet the criteria desirable for the initiative, some kind of nasty compromise may have to be made. Personally, I would have preferred to have the option of the Ingenic jz4775 card that was close to being offered in the campaign, although I have seen signs of Ingenic doing binary-only code drops on certain code-sharing sites, so they do not necessarily have clean hands, either. But they do at least make the source code for such binaries available elsewhere, if you know where to look. Thus, it is most likely that they do not really understand the precise obligations of the software licences concerned, as opposed to deliberately withholding the source code.

But it may well be that Chinese companies do not necessarily operate (or are regulated) on the same principles as certain European, American and Japanese companies, for whom the familiar regime of corporate accountability allows us to judge a company on any wrongdoing: any executives unaware of such wrongdoing have been negligent or ineffective at building the proper processes of supervision, thus permitting an unethical corporate culture, while any executives aware of such wrongdoing have arguably cultivated an unethical corporate culture themselves. That does not excuse unethical behaviour, but it might at least entertain the idea that by supporting an ethical faction within a company, the unethical factions may be weakened or even eliminated. If that really is how the game is played, of course, and it is not just an excuse for finger-pointing where nobody is held to account for anything.

But companies elsewhere should certainly not be looking for a weakening of their accountability structures so as to maintain a similarly convenient situation of corporate hypocrisy: if Sony BMG does something unethical, Sony Imaging should take the bad with the good when they share and exploit the Sony brand; people cannot have things both ways. And Chinese companies should comply with international governance principles, if only to reassure their investors that nasty surprises (and liabilities) do not lie in wait because parts of such businesses were poorly supervised and not held accountable for any unethical activities taking place.

It is up to everyone to make their own decision about this. The policy of the campaign is that the A20 can be supported by Free Software without needing any proprietary software, does not rely on any Allwinner-engineered, licence-violating software (which might be perceived as a good thing), and is merely the first step into a wider endeavour that could be conveniently undertaken with the limited resources available at the time. Later computer cards may ignore Allwinner entirely, especially if the company does not clean up its act, but such cards may never get made if the campaign fails and the wider endeavour never even begins in earnest.

(And I sincerely hope that those who are apparently so outraged by GPL violations actually support organisations seeking to educate and correct companies who commit such violations.)

“You could buy a top-end laptop for that price!”

Sure you could. But this isn’t about a crowd-funding campaign trying to magically compete with an optimised production process that turns out millions of units every year backed by a multi-billion-dollar corporation. It is about highlighting the possibilities of more scalable (down to the economically-viable manufacture of a single unit), more sustainable device design and construction. And by the way, that laptop you were talking about won’t be upgradeable, so when you tire of its performance or if the battery loses its capacity, I suppose you will be disposing of it (hopefully responsibly) and presumably buying something similarly new and shiny by today’s measures.

Meanwhile, with EOMA68, the computing part of the supposedly overpriced laptop will be upgradeable, and with sensible device design the battery (and maybe other things) will be replaceable, too. Over time, EOMA68 solutions should be competitive on price anyway, because larger numbers of them will be produced, but unlike traditional products, the increased usable lifespans of EOMA68 solutions will offer longer-term savings to their purchasers, too.

“You could just buy a used laptop instead!”

Sure you could. At some point, just to have a CPU without a surveillance engine and with some level of upgrade potential, you will need to buy a very old laptop, and its specification might disappoint you. Even worse, things don’t last forever, particularly batteries and certain kinds of electronic components. Replacing those things may well be a challenge, and although it is worthwhile to make sure things get reused rather than immediately discarded, you can’t rely on picking up a particular product in the second-hand market forever. And relying on sourcing second-hand items effectively treats them as limited edition products, whereas the EOMA68 initiative is meant to be concerned with reliably producing widely-available products.

“Why pay more for ideological purity?”

Firstly, words like “ideology”, “religion”, “church”, and so on, might be useful terms for trolls to poison and polarise any discussion, but does anyone not see that expecting suspiciously cheap, increasingly capable products to be delivered in an almost conveyor belt fashion is itself subscribing to an ideology? One that mandates that resources should be procured at the minimum cost and processed and assembled at the minimum cost, preferably without knowing too much about the human rights abuses at each step. Where everybody involved is threatened that at any time their role may be taken over by someone offering the same thing for less. And where a culture of exploitation towards those doing the work grows, perpetuating increasing wealth inequality because those offering the services in question will just lean harder on the workers to meet their cost target (while they skim off “their share” for having facilitated the deal). Meanwhile, no-one buying the product wants to know “how the sausage is made”. That sounds like an ideology to me: one of neoliberalism combined with feigned ignorance of the damage it does.

Anyway, people pay for more sustainable, more ethical products all the time. While the wilfully ignorant may jeer that they could just buy what they regard as the same thing for less (usually being unaware of factors like quality, never mind how these things get made), more sensible people see that the extra they pay provides the basis for a fairer, better society and higher-quality goods.

“There’s no point to such modularity!”

People argue this alongside the assertion that systems are easy to upgrade and that they can independently upgrade the RAM and CPU in their desktop tower system or whatever, although they usually start off by talking about laptops – though clearly not the kind of “welded shut” laptops that they or maybe others would apparently prefer to buy (see above). But systems are getting harder to upgrade, particularly portable systems like laptops, tablets and smartphones (with the Fairphone 2 being a rare exception in that it might be upgradeable), and even upgradeable systems are not typically upgraded by most end-users: they may only manage to do so by enlisting the help of more knowledgeable relatives and friends.

I use a 32-bit system that is over 11 years old. It could have more RAM, and I could do the job of upgrading it, but guess how much I would be upgrading it to: 2GB, which is as much as is supported by the two prototyped 32-bit architecture EOMA68 computer card designs (A20 and jz4775). Only certain 32-bit systems actually support more RAM, mostly because doing so requires the use of relatively exotic architectural features (such as PAE on x86) that a lot of software doesn’t support. As for the CPU, there is no sensible upgrade path even if I were sure that I could remove the CPU without causing damage to it or the board. Now, 64-bit systems might offer more options, and in upgradeable desktop systems more RAM might be added, but it still relies on what the chipset was designed to support. Some chipsets may limit upgrades based on either manufacturer pessimism (no-one will be able to afford larger amounts in the near future) or manufacturer cynicism (no-one will upgrade to our next product if they can keep adding more RAM).

EOMA68 makes a trade-off in order to support the upgrading of devices in a way that should be accessible to people who are not experts: no-one should be dealing with circuit boards and memory modules. People who think hardware engineering has nothing to do with compromises should get out of their armchair, join one of the big corporations already doing hardware, and show them how it is done, because I am sure those companies would appreciate such market-dominating insight.

An EOMA68 computer card with the micro-desktop device (courtesy Rhombus Tech/Crowd Supply)

Back to the Campaign

But really, the criticisms are not the things to focus on here. Maybe EOMA68 was interesting to you and then you read one of these criticisms somewhere and started to wonder about whether it is a good idea to support the initiative after all. Now, at least you have another perspective on them, albeit from someone who actually believes that EOMA68 provides an interesting and credible way forward for sustainable technology products.

Certainly, this campaign is not for everyone. Above all else it is crowd-funding: you are pledging for rewards, not buying things, even though the aim is to actually manufacture and ship real products to those who have pledged for them. Some crowd-funding exercises never deliver anything because they underestimate the difficulties of doing so, leaving a horde of angry backers with nothing to show for their money. I cannot make any guarantees here, but given that prototypes have been made over the last few years, that videos have been produced with a charming informality that would surely leave no-one seriously believing that “the whole thing was just rendered” (which tends to happen a lot with other campaigns), and given the initiative founder’s stubborn refusal to give up, I have a lot of confidence in him to make good on his plans.

(A lot of campaigns underestimate the logistics and, having managed to deliver a complicated technological product, fail to manage the apparently simple matter of “postage”, infuriating their backers by being unable to get packages sent to all the different countries involved. My impression is that logistics expertise is what Crowd Supply brings to the table, and it really surprises me that established freight and logistics companies aren’t dipping their toes in the crowd-funding market themselves, either by running their own services or taking ownership stakes and integrating their services into such businesses.)

Personally, I think that $65 for a computer card that has more RAM than most single-board computers is actually a reasonable price, but I can understand that some of the other rewards seem a bit more expensive than one might have hoped. But these are effectively “limited edition” prices, and the aim of the exercise is not merely to make some things, get them to the clique of backers, and then never do anything like this ever again. Rather, the aim is to demonstrate that such products can be delivered, develop a market for them where the quantities involved will be greater, and thus be able to increase the competitiveness of the pricing, iterating on this hopefully successful formula. People are backing a standard and a concept, with the benefit of actually getting some hardware in return.

Interestingly, one priority of the campaign has been to seek the FSF’s “Respects Your Freedom” (RYF) endorsement. There is already plenty of hardware that employs proprietary software at some level, leaving the user to merely wonder what some “binary blob” actually does. Here, with one of the software distributions for the computer card, all of the software used on the card, together with the policies of the GNU/Linux distribution concerned – a surprisingly awkward obstacle – will have to meet the FSF’s criteria. Thus, the “Libre Tea” card will hopefully be one of the first general purpose computing solutions to actually be designed for RYF certification and to obtain it, too.

The campaign runs until August 26th and has over a thousand pledges. If nothing else, go and take a look at the details and the updates, with the latter providing lots of background including video evidence of how the software offerings have evolved over the course of the campaign. And even if it’s not for you, maybe people you know might appreciate hearing about it, even if only to follow the action and to see how crowd-funding campaigns are done.

Other people’s thoughts on “Freedom and security issues on x86 platforms”

Saturday, July 2nd, 2016

A couple of months ago, we had a brief discussion on the FSFE discussion mailing list about the topic of “Uncorrectable freedom and security issues on x86 platforms“, but it just came to my attention that a bunch of other people were discussing our discussion, too. Hacker News is, of course, so very “meta”, but fortunately they got onto discussing the actual topic as well.

The initial message in the original discussion advocated adopting the Power computing architecture as a primary hardware platform for Free Software. Now, the Hacker News participants were surprised that nobody mentioned SPARC, and yet I was sure that SPARC did get mentioned in our discussion. A brief search doesn’t find any mention of it, however, and I’m embarrassed to admit my mistake, especially since I do know about things like LEON and even used SPARC-based hardware for many years. (The Sun 4 workstations at my university had SPARC CPUs, for instance.)

I suppose the disconnect here involves price, availability and performance of readily-available products. Certainly, a free hardware SPARC implementation can be synthesised for an FPGA, but the previous discussion covered things like RISC-V in a similar fashion: it’s nice to have the ability to deploy a “soft processor” in an FPGA, but customers of computing products usually expect “hard” CPU performance. And you can at least buy ARM and MIPS CPUs with decent-enough performance that support Free Software from the very bottom of the software stack, even if they aren’t free hardware implementations.

The participants in the meta-discussion wondered why MIPS became so popular given that there are licensing fees involved, whereas Sun made certain SPARC designs available under the GPL, and given that the SPARC architecture is supposedly royalty-free. For some manufacturers, this is asking the wrong question: they did not seek to license the patent-encumbered versions of the MIPS architecture; like the OpenRISC initiative, they merely implemented the unencumbered versions instead.

It would be nice to have a high-performance, inexpensive, readily-available free hardware CPU for use in free hardware designs. And of course those designs would support Free Software completely. But until that comes to pass, we have to work with what we can get. And indeed, for whichever architecture seems to be favoured for such a role, we also need to have usable and accessible hardware that is compatible with our favoured architecture so that we may prepare ourselves for the day it finally gets rolled out.

There might be a reason why SPARC isn’t so well supported by things like GNU/Linux distributions. Sadly, unlike various competitors, inexpensive SPARC products seem to be thin on the ground, and without those the efforts to maintain ports of large Free Software collections inevitably grind to a halt, but I would be only too happy for someone to point me to a source of products that I may have overlooked. There is no inherent reason why SPARC couldn’t be a viable platform for Free Software, regardless of what people may have to say about register windows!

Testing Times for Free Software and Open Hardware

Tuesday, January 12th, 2016

The last few months haven’t been too kind on Free Software and open hardware initiatives in a number of ways. Here, in a shorter form than one might usually expect from me, are some problematic developments on topics that I may have covered in the past year.

Software Freedom Undervalued

About a couple of months ago, the Software Freedom Conservancy started a fund-raising campaign after it became apparent that companies could not be relied upon to support the organisation’s activities. Since the start of the campaign, many individuals have stepped up and pledged financial support of their own, which is very generous of them, as is the support of enlightened organisations that have offered to match individual contributions.

Sadly, such generosity seems not to be shared by many of the largest companies making money from Free Software and from Linux in particular, and thus from the non-financial contributions that make projects like Linux viable in the first place, with many of those contributions even coming from those same generous individuals who have supported the Conservancy financially. And let us consider for a moment why one prominent umbrella organisation’s members might not want to enforce the GPL, especially given that some of them have in the past been successfully prosecuted for violating that licence in relation to various Free Software projects.

The Proprietary Instincts of the BBC

The BBC Micro Bit was a topic covered in the last year, when I indicated a degree of caution about the mistakes of the past being repeated needlessly. And indeed, for some time, everything was being done behind the curtain of a non-disclosure agreement (NDA), meaning that very little information was being made available about the device and accompanying materials, and thus very little could be done by the average member of the public to prepare for the availability of the device, let alone develop their own materials, software, accessories or anything else for it.

Since then, a degree of secrecy has been eliminated, and efforts have been made to get the embedded variant of Python known as MicroPython working on the board. However, certain parts of that work still appear to be encumbered by NDA, arguably making the effort of developing Python-related materials something of a social networking exercise. Meanwhile, notorious industry monopolist, Microsoft, somehow managed to muscle in on the initiative and take control of the principally-supported method of developing software with the device. I guess people at the BBC and their friends in politics and business don’t always learn from the mistakes of the past, particularly as they spend other people’s money.

The Walled Garden Party’s Hangover for Free Software Development

Just over twelve months ago, I made some observations about the Python core development group’s attraction to GitHub. It seems that the infatuation with the enthusiastic masses and their inevitable unleashing on Python assets, with the expectation of stimulating an exponential upturn in development activity, will now be gratified through a migration of various Python infrastructure components to the proprietary and centralised service that GitHub offers. (I have my doubts as to whether CPython contribution barriers are really the cause of Python’s current malaise, despite the usual clamour for Git and the associated “network effects” amongst a community of self-proclaimed version control wizards whose powers somehow don’t extend to mastering simple workflows with other tools.)

Anatoly Techtonik makes some interesting points, which will presumably go unheard because those involved have all decided not to listen to him any more. One of the more disturbing ones is that the “comparison shopping” mentality, where Free Software developers abandon their colleagues writing various tools and systems in favour of random corporations offering proprietary stuff at no cost, may well result in the Free Software solutions in such areas becoming seen as uncompetitive and unattractive. What those making such foolish decisions fail to realise is that their own projects can easily get the same treatment, if nobody bothers to see beyond the end of their own nose.

The result of all this is less funding and fewer resources for Free Software projects, with potentially fewer contributions, too, as the attraction of supporting “losing” solutions starts to fade. Community-oriented Free Software is arguably grossly underfunded as it is: we don’t really need other Free Software developers abandoning or undermining their colleagues while ridiculing those colleagues’ “ideological purity“. And, of course, volunteer effort will undoubtedly be burned up in the needless migration to the proprietary solution, setting everyone up for another costly transition down the road, which experience indicates is always more work than anyone anticipated (if they even bothered to think ahead at all).

PayPal: Doesn’t Pay, Not Your Pal

It has been a long time since I wrote about the Neo900 project. Things were looking promising: necessary components had been secured, and everyone was just waiting for Nikolaus to finish his work with the Pyra handheld console. And then we learned that PayPal had decided to hold a significant amount of money as a form of “security”, thus cutting off a vital source of funds for actually doing the work. Apparently, PayPal have a habit of doing this kind of thing, on one reported occasion even taking the opportunity to then offer loans to those people they deliberately put in such a difficult position.

If you supported the Neo900 project and pledged funds via PayPal, you need to tell PayPal to actually pay the project. You know: like the verb in their company name. Otherwise, in the worst case, you may not only not get a Neo900 and not see it developed to completion, but you will also have loaned your money to a large corporation for a substantial period and earned no interest on that involuntary loan, perhaps even incurring fees for the privilege. (So, please see the “How to fix it” section of the relevant article.)

Maybe in 2016, people will become a lot clearer about who their real friends are. Let us hope so!

Random Questions about Fairphone Source Code Availability

Saturday, September 26th, 2015

I was interested to read the recent announcement about source code availability for the first Fairphone device. I’ve written before about the threat to that device’s continued viability and Fairphone’s vague position on delivering a device that properly supports Free Software. It is nice to see that the initiative takes such matters seriously and does not seem to feel that letting its partners serve up what they have lying around is sufficient. However, a few questions arise, starting with the following quote from the announcement:

We can happily say that we have recently obtained a software license from all our major partners and license holders that allows us to modify the Fairphone 1 software and release new versions to our users. Getting that license also required us to obtain rights to use and distribute Mentor Graphics’s RTOS used on the phone. (We want to thank Mentor Graphics in making it possible for us to acquire the distribution license for their RTOS, as well as other partners for helping us achieve this.)

I noted before that various portions of the software are already subject to copyleft licensing, but if we ignore those (and trust that the sources were already being made available), it is interesting to consider the following questions:

  • What is “the Fairphone 1 software” exactly?
  • Fairphone may modify the software but what about its customers?
  • What role does the Mentor Graphics RTOS have? Can it be replaced by customers with something else?
  • Do the rights to use and distribute the RTOS extend to customers?
  • Do those rights extend to the source code of the RTOS, and do those rights uphold the four freedoms?

On further inspection, some contradictions emerge, perhaps most efficiently encapsulated by the following quote:

Now that Fairphone has control over the Fairphone 1 source code, what’s next? First of all, we can say that we have no plans to stop supporting the Fairphone hardware. We will continue to apply security fixes as long as it is feasible for the years to come. We will also keep exploring ways to increase the longevity of the Fairphone 1. Possibilities include upgrading to a more recent Android version, although we would like to manage expectations here as this is still very much a longshot dependent on cooperation from license holders and our own resources.

If Fairphone has control over the source code, why is upgrading to a more recent Android version dependent on cooperation with licence holders? If Fairphone “has control” then the licence holders should already have provided the necessary permissions for Fairphone to actually take control, just as one would experience with the four freedoms. One wonders which permissions have been withheld and whether these are being illegitimately withheld for software distributed under copyleft licences.

With a new device in the pipeline, I respect the persistence of Fairphone in improving the situation, but perhaps the following quote summarises the state of the industry and the struggle for sustainable licensing and licence compliance:

It is rather unusual for a small company like Fairphone to get such a license (usually ODMs get these and handle most of the work for their clients) and it is uncommon that a company attempts and manages to obtain such a license towards the end of the economic life cycle of the product.

Sadly, original design manufacturers (ODMs) have a poor reputation: often being known for throwing binaries over the wall whilst being unable or unwilling to supply the corresponding sources, with downstream manufacturers and retailers claiming that they have no leverage to rectify such licence violations. Although the injustices and hardships of those working to supply the raw materials for products like the Fairphone, along with those of the people working to construct the devices themselves, make other injustices seem slight – thinking especially of those experienced by software developers whose copyright is infringed by dubious industry practices – dealing with unethical and untidy practices wherever they may be found should be part of the initiative’s objectives.

From what I’ve seen and heard, Fairphone 2 should have a better story for chipset support and Free Software, but again, an inspection of the message raises some awkward questions. For example:

In the coming months we are going to launch several programs that address different aspects of creating fairer software. For now, one of the best tools for us to reach these goals is to embrace open source principles. With this in mind and without further ado, we’re excited to announce that we are going to release the complete build environment for Fairphone OS on Fairphone 2, which contains the full open source code, all the tools and the binary blobs that will allow users to build their own Fairphone OS.

To be fair, binary blobs are often difficult to avoid: desktop computers often use them for various devices, and even devices like the Neo900 that emphasise a completely Free Software stack will end up using them for certain functions (mitigating this by employing other technical measures). Making the build environment available is a good thing: frequently, this aspect is overlooked and anyone requesting the source code can be left guessing about build configuration details in an exercise that is effectively a matter of doing the vendor’s licence compliance work for them. But here, we are left wondering where the open source code ends, where binary blobs will be padding out the distribution, and what those blobs are actually for.

We need to keep asking difficult questions about such matters even if what Fairphone is doing is worthy in its own right. Not only does it safeguard the interests of the customers involved, but it also helps Fairphone to focus on doing the right thing. It does feel unkind to criticise what seems like a noble initiative for not doing more when they obviously try rather hard to do the right thing in so many respects. But by doing the right thing in terms of the software as well, Fairphone can uphold its own reputation and credibility: something that all businesses need to remember, as certain very large companies have very recently discovered.

Hardware Experiments with Fritzing

Friday, August 28th, 2015

One of my other interests, if you can even regard it as truly separate to my interests in Free Software and open hardware, involves the microcomputer systems of the 1980s that first introduced me to computing and probably launched me in the direction of my current career. There are many aspects of such systems that invite re-evaluation of their capabilities and limitations, leading to the consideration of improvements that could have been made at the time, as well as more radical enhancements that unashamedly employ technology that has only become available or affordable in recent years. Such “what if?” thought experiments and their hypothetical consequences are useful if we are to learn from the strategic mistakes once made by systems vendors, to have an informed perspective on current initiatives, and to properly appreciate computing history.

At the same time, people still enjoy actually using such systems today, writing new software and providing hardware that makes such continuing usage practical and sustainable. These computers and their peripherals are certainly “getting on”, and acquiring or rediscovering such old systems does not necessarily mean that you can plug them in and they still work as if they were new. Indeed, the lifetime of magnetic media and the devices that can read it, together with issues of physical decay in some components, mean that alternative mechanisms for loading and storing software have become attractive for some users, having been developed to complement or replace the cassette tape and floppy disk methods that those of us old enough to remember would have used “back in the day”.

My microcomputer of choice in the 1980s was the Acorn Electron – a cut-down, less expensive version of the BBC Microcomputer hardware platform – which supported only cassette storage in its unexpanded form. However, some expansion units added the disk interfaces present on the BBC Micro, while others added the ability to use ROM-based software. On the BBC Micro, one would plug ROM chips directly into sockets, and some expansion units for the Electron supported this method, too. The official Plus 1 expansion chose instead to support the more friendly expansion cartridge approach familiar to users of other computing and console systems, with ROM cartridges being the delivery method for games, applications and utilities in this form, providing nothing much more than a ROM chip and some logic inside a convenient-to-use cartridge.

The Motivation

A while ago, my brother, David, became interested in delivering software on cartridge for the Electron, and a certain amount of discussion led him to investigate various flash memory integrated circuits (ICs, chips), notably the AMD Am29F010 series. As technological progress continues, such devices provide a lot of storage in comparison to the ROM chips originally used with the Electron: the latter having only 16 kilobytes of capacity, whereas the Am29F010 variant chosen here has a capacity of 128 kilobytes. Meanwhile, others chose to look at EEPROM chips, notably the AT28C256 from Atmel.

Despite the manufacturing differences, both device types behave in a very similar way: a good idea for the manufacturers who could then sell products that would be compatible straight away with existing products and the mechanisms they use. In short, some kind of de-facto standard seems to apply to programming these devices, and so it should be possible to get something working with one and then switch to the other, especially if one kind becomes too difficult to obtain.

Now, some people realised that they could plug such devices into their microcomputers and program them “in place” using a clever hack where writes to the addresses that correspond to the memory provided by the EEPROM (or, indeed, flash memory device) in the computer’s normal memory map can be trivially translated into addresses that have significance to the EEPROM itself. But not routinely using such microcomputers myself, and wanting more flexibility in the programming of such devices, not to mention also avoiding the issue of getting software onto such computers so that it can be written to such non-volatile memory, it seemed like a natural course of action to try to do the programming with the help of some more modern conveniences.

And so I considered the idea of getting a microcontroller solution like the Arduino to do the programming work. Since an Arduino can be accessed over USB, a ROM image could be conveniently transferred from a modern computer and, with a suitable circuit wired up, programmed into the memory chip. ROM images can thus be obtained in the usual modern way – say, from the Internet – and then written straight to the memory chip via the Arduino, rather than having to be written first to some other medium and transferred through a more convoluted sequence of steps.
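
To make the workflow concrete, here is a minimal sketch of the Arduino side, assuming a hypothetical protocol in which the host simply streams the ROM image byte-by-byte over the serial connection. The writeMemoryByte routine is a placeholder here: the circuitry that makes it possible is the subject of the rest of this article.

#include <Arduino.h>

unsigned long address = 0; // next address in the memory chip to be programmed

// Placeholder: performing the actual bus write cycle depends on the
// circuit described below.
void writeMemoryByte(unsigned long addr, byte value)
{
    // ...latch the address, present the data, pulse the write signal...
}

void setup()
{
    Serial.begin(9600); // the host streams the ROM image over USB serial
}

void loop()
{
    if (Serial.available() > 0)
    {
        writeMemoryByte(address++, (byte) Serial.read());
        Serial.write('.'); // acknowledge each byte so the host can pace itself
    }
}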


Being somewhat familiar with Arduino experimentation, I made it my first exercise to build the circuit that can be used to program the memory device. Here, the first challenge presented itself: the chip employs 17 address lines, 8 data lines, and 3 control lines. Meanwhile, the Arduino Duemilanove only provides 14 digital pins and 6 analogue pins, with 2 of the digital pins (0 and 1) being unusable if the Arduino is communicating with a host, and another (13) being connected to the LED and seemingly untrustworthy. Even with the analogue pins in service as digital output pins, only 17 pins would be available for interfacing.

The pin requirements

  Arduino Duemilanove      Am29F010
  11 digital pins (2-12)   17 address pins (A0-A16)
  6 analogue pins (0-6)    8 data pins (DQ0-DQ7)
                           3 control pins (CE#, OE#, WE#)
  17 total                 28 total

So, a way of multiplexing the Arduino pins was required, where at one point in time the Arduino would be issuing signals for one purpose, these signals would then be “stored” somewhere, and then at another point in time the Arduino would be issuing signals for another purpose. Ultimately, these signals would be combined and presented to the memory device in a hopefully coherent fashion. We cannot really do this kind of multiplexing with the control signals because they typically need to be coordinated to act in a timing-sensitive fashion, so we would be concentrating on the other signals instead.

So which signals would be stored and issued later? Well, with as many address lines needing signals as there are available pins on the Arduino, it would make sense to “break up” this block of signals into two. So, when issuing an address to the memory device, we would ideally be issuing 17 bits of information all at the same time, but instead we take approximately half of them (8 bits) and issue the necessary signals for storage somewhere. Then, we would issue the other half or so (8 bits) for storage. At this point, we need only a maximum of 8 signal lines to communicate information through this mechanism. (Don’t worry, I haven’t forgotten the other address bit! More on that in a moment!)

How would we store these signals? Fortunately, I had considered such matters before and had ordered some 74-series logic chips for general interfacing, including 74HC273 flip-flop ICs. These can be given 8 bits of information and will then, upon command, hold that information while other signals may be present on their input pins. If we take two of these chips and attach their input pins to those 8 Arduino pins we wish to use for communication, we can “program” each 74HC273 in turn – one with 8 bits of an address, the other with another 8 bits – and then the output pins will be presenting 16 bits of the address to the memory chip. At this point, those 8 Arduino pins could even be doing something else because the 74HC273 chips will be holding the signal values from an earlier point in time and won’t be affected by signals presented to their input pins.

Of all the non-control signals, with 16 signals out of the way, that leaves only the 8 signals for the memory chip’s data lines, plus that other address signal, to deal with. But since the Arduino pins used to send address signals are free once the addresses are sent, we can re-use those 8 pins for the data signals. So, with our signal storage mechanism, we get away with only using 8 Arduino pins to send 24 pieces of information! We can live with allocating that remaining address signal to a spare Arduino pin.

Address and data pins

  Arduino Duemilanove      74HC273          Am29F010
  8 input/output pins      8 output pins    8 address pins (A0-A7)
                           8 output pins    8 address pins (A8-A15)
                                            8 data pins (DQ0-DQ7)
  1 output pin                              1 address pin (A16)
  9 total                                   25 total
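
As a minimal sketch of this latching mechanism, assuming illustrative pin assignments throughout (the clock input of each 74HC273 gets a dedicated Arduino pin, an allocation accounted for below):

#include <Arduino.h>

// The 8 shared pins feeding the inputs of both 74HC273 chips.
const int busPins[8] = {4, 5, 6, 7, 8, 9, 10, 11};
const int cp1 = 2; // clock input of the 74HC273 holding A0-A7
const int cp2 = 3; // clock input of the 74HC273 holding A8-A15

// Present one byte on the shared pins, then latch it with a rising clock edge.
void latchByte(int clockPin, byte value)
{
    for (int i = 0; i < 8; i++)
        digitalWrite(busPins[i], (value >> i) & 1);
    digitalWrite(clockPin, HIGH); // the 74HC273 captures its inputs on this edge
    digitalWrite(clockPin, LOW);  // return the clock to its idle level
}

void setup()
{
    for (int i = 0; i < 8; i++)
        pinMode(busPins[i], OUTPUT);
    pinMode(cp1, OUTPUT);
    pinMode(cp2, OUTPUT);

    latchByte(cp1, 0x34); // A0-A7 of a 17-bit address...
    latchByte(cp2, 0x12); // ...and A8-A15, now both held by the flip-flops,
                          // leaving the shared pins free for other duties
}

void loop()
{
}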

That now leaves us with the task of managing the 3 control signals for the memory chip – to make it “listen” to the things we are sending to it – but at the same time, we need to consider the control lines for those flip-flop ICs. Since it turns out that we need 1 control signal for each of the 74HC273 chips, we need to allocate 5 additional interfacing pins on the Arduino for sending control signals to the different chips.

The final sums

  Arduino Duemilanove      74HC273                            Am29F010
  8 input/output pins      8 output pins                      8 address pins (A0-A7)
                           8 output pins                      8 address pins (A8-A15)
                                                              8 data pins (DQ0-DQ7)
  1 output pin                                                1 address pin (A16)
  3 output pins                                               3 control pins (CE#, OE#, WE#)
  2 output pins            2 control pins (CP for both ICs)
  14 total                                                    28 total

In the end, we don’t even need all the available pins on the Arduino, but the three going spare wouldn’t be enough to save us from having to use the flip-flop ICs.
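
Pulling these allocations together, a condensed sketch of a single write cycle might look like the following, with all pin numbers merely illustrative. One caveat: a flash device like the Am29F010 expects its own program command sequence (described in its datasheet) before it will accept a data byte, so the raw bus cycle shown here is closer to what a simple EEPROM like the AT28C256 would accept directly.

#include <Arduino.h>

const int busPins[8] = {4, 5, 6, 7, 8, 9, 10, 11}; // shared address/data pins
const int cp1 = 2;  // clock for the 74HC273 holding A0-A7
const int cp2 = 3;  // clock for the 74HC273 holding A8-A15
const int a16 = 12; // the remaining address bit, driven directly
const int ce = A5, oe = A4, we = A3; // active-low control lines (CE#, OE#, WE#)

// Present one byte on the shared address/data pins.
void putByte(byte value)
{
    for (int i = 0; i < 8; i++)
        digitalWrite(busPins[i], (value >> i) & 1);
}

void writeMemoryByte(unsigned long address, byte value)
{
    putByte(address & 0xff);        // latch A0-A7 into the first flip-flop IC
    digitalWrite(cp1, HIGH);
    digitalWrite(cp1, LOW);
    putByte((address >> 8) & 0xff); // latch A8-A15 into the second one
    digitalWrite(cp2, HIGH);
    digitalWrite(cp2, LOW);
    digitalWrite(a16, (address >> 16) & 1);

    putByte(value);                 // reuse the shared pins for the data byte
    digitalWrite(ce, LOW);          // enable the chip...
    digitalWrite(we, LOW);          // ...pulse WE# to perform the write...
    digitalWrite(we, HIGH);
    digitalWrite(ce, HIGH);         // ...and deselect it again
}

void setup()
{
    for (int i = 0; i < 8; i++)
        pinMode(busPins[i], OUTPUT);
    const int controlPins[] = {cp1, cp2, a16, ce, oe, we};
    for (int i = 0; i < 6; i++)
        pinMode(controlPins[i], OUTPUT);
    digitalWrite(ce, HIGH); // keep the active-low control lines inactive
    digitalWrite(oe, HIGH);
    digitalWrite(we, HIGH);

    writeMemoryByte(0x00000, 0x42); // program a single example byte at address 0
}

void loop()
{
}

Reading back for verification would work in a similar fashion, with the shared pins switched to inputs and OE# asserted instead of WE#.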

With this many pins in use, and the need to connect them together, there are going to be a lot of wires in use:

The breadboard circuit with the Arduino and ICs

The result is somewhat overwhelming! Presented in a more transparent fashion, and with some jumper wires replaced with breadboard wires, it is slightly easier to follow:

An overview of the breadboard circuit

The orange wires between the two chips on the right-hand breadboard indicate how the 8 Arduino pins are connected beyond the two flip-flop chips and directly to the flash memory chip, which would sit on the left-hand breadboard between the headers inserted into that breadboard (which weren’t used in the previous arrangement).

Making a Circuit Board

It should be pretty clear that while breadboarding can help a lot with prototyping, things can get messy very quickly with even moderately complicated circuits. And while I was prototyping this, I was running out of jumper wires that I needed for other things! Although this circuit is useful, I don’t want to have to commit my collection of components to keeping it available “just in case”, but at the same time I don’t want to have to wire it up when I do need it. The solution to this dilemma was obvious: I should make a “proper” printed circuit board (PCB) and free up all my jumper wires!

It is easy to be quickly overwhelmed when thinking about making circuit boards. Various people recommend various different tools for designing them, ranging from proprietary software that might be free-of-charge in certain forms but which imposes arbitrary limitations on designs (as well as curtailing your software freedoms) through to Free Software that people struggle to recommend because they have experienced stability or functionality deficiencies with it. And beyond the activity of designing boards, the act of getting them made is confused by the range of services in various different places with differing levels of service and quality, not to mention those people who advocate making boards at home using chemicals that are, shall we say, not always kind to the skin.

Fortunately, I had heard of an initiative called Fritzing some time ago, initially in connection with various interesting products being sold in an online store, but that store then appeared to be offering a service – Fritzing Fab – to fabricate individual circuit boards. What isn’t clear, or at least wasn’t really clear to me straight away, was that Fritzing is also some Free Software that can be used to design circuit boards. Conveniently, it is also available as a Debian package.

The Fritzing software aims to make certain tasks easy that would perhaps otherwise require a degree of familiarity with the practice of making circuit boards. For instance, having decided that I wanted to interface my circuit to an Arduino as a shield which sits on top and connects directly to the connectors on the Arduino board, I can choose an Arduino shield PCB template in the Fritzing software and be sure that if I then choose to get the board made, the dimensions and placement of the various connections will all be correct. So for my purposes and with my level of experience, Fritzing seems like a reasonable choice for a first board design.

Replicating the Circuit

Fritzing probably gets a certain degree of disdain from experienced practitioners of electronic design because it seems to emphasise the breadboard paradigm, rather than insisting that a proper circuit diagram (or schematic) acts as the starting point. Here is what my circuit looks like in Fritzing:

The breadboard view of my circuit in Fritzing

You will undoubtedly observe that it isn’t much tidier than my real-life breadboard layout! Having dragged a component like the Arduino Uno (mostly compatible with the Duemilanove) onto the canvas along with various breadboards, and then having dragged various other components onto those breadboards, all that remains is to wire them up like we managed to do in reality. Here, Fritzing helps out by highlighting connections between things, so that breadboard columns appear green as wires are connected to them, indicating that an electrical connection is made and applies to all points in that column on that half of the breadboard (the upper or lower half as seen in the above image). It even highlights things that are connected together according to the properties of the device, so that any attempt to modify a connection that leads to one of the ground pins on the Arduino also highlights the other ground pins as the modification is being done.

I can certainly understand criticism of this visual paradigm. Before wiring up the real-life circuit, I typically write down which things will be connected to each other in a simple table like this:

Example connections

  Arduino    74HC273 #1    74HC273 #2    Am29F010
  A5                                     CE#
  A4                                     OE#
  A3                                     WE#
  2          CP
  3                        CP
  4          D3            D3            DQ3
If I were not concerned with prototyping with breadboards, I would aim to use such information directly and not try and figure out which size breadboard I might need (or how many!) and how to arrange the wires so that signals get where they need to be. When one runs out of points in a breadboard column and has to introduce “staging” breadboards (as shown above by the breadboard hosting only incoming and outgoing wires), it distracts from the essential simplicity of a circuit.

Anyway, once the circuit is defined – and here it really does help that clicking on a terminal or pin highlights the connected terminals and pins – we can move on to the schematic view and try to produce something that makes a degree of sense. Here is what that looks like in Fritzing:

The schematic for the circuit in Fritzing

Now, the observant amongst you will notice that this doesn’t look very tidy at all. First of all, there are wires going directly between terminals without any respect for tidiness whatsoever. The more observant will notice that some of the wires end in the middle of nowhere, although on closer inspection they appear to be aimed at a pin of an IC but are shifted to the right on the diagram. I don’t know what causes this phenomenon, but it would seem that as far as the software is concerned, they are connected to the component. (I will come back to how components are defined and the pitfalls involved later on.)

Anyway, one might be tempted to skip over this view and start designing a PCB layout directly, but I found that it helped to try and tidy this up a bit. First of all, the effects of the breadboard paradigm tend to manifest themselves in connections that do not really reflect the logical relationships between components, so that an Arduino pin that feeds an input pin on both flip-flop ICs as well as a data pin on the flash memory IC may have its connections represented by a wire first going from the Arduino to one of the flip-flop ICs, then to the other flip-flop IC, and finally to the flash memory IC in some kind of sequential wiring. Although this is not electrically incorrect, with a thought to the later track routing on a PCB, it may not be the best representation to help us think about such subsequent problems.

So, for my own sanity, I rearranged the connections to “fan out” from the Arduino as much as possible. This was at times a frustrating exercise, as those of you with experience with drawing applications might recognise: trying to persuade the software that you really did select a particular thing and not something else, and so on. Again, selecting the end of a connection causes some highlighting to occur, and the desired result is that selecting a terminal highlights the appropriate terminals on the various components and not the unrelated ones.

Sometimes that highlighting behaviour provides surprising and counter-intuitive results. Checking the breadboard layout tends to be useful because Fritzing occasionally thinks that a new connection between certain pins has been established, and it helpfully creates a “rats nest” connection on the breadboard layout without apparently saying anything. Such “rats nest” connections are logical connections that have not been “made real” by the use of a wire, and they feature heavily in the PCB view.

PCB Layout

For those of us with no experience of PCB layout who just admire the PCBs in everybody else’s products, the task of laying out the tracks so that they make electrical sense is a daunting one. Fritzing will provide a canvas containing a board and the chosen components, but it is up to you to combine them in a sensible way. Here, the blank board itself corresponds to the Arduino that appears in the breadboard and schematic views.

Slightly confusing though the depiction of the Arduino in the breadboard view may be, the pertinent aspects of it are merely the connectors on that device, not the functionality of the device itself, which we obviously aren’t intending to replicate. So, instead of the details of an actual Arduino or its functional equivalent, we merely see the connection points required by the Arduino. And by choosing a board template for an Arduino shield, those connection points should appear in the appropriate places, with the board itself having the appropriate size and shape to be an Arduino shield.

Here’s how the completed board looks:

The upper surface of the PCB design in Fritzing

Of course, I have spared you a lot of work by just showing the image above. In practice, the components whose outlines and connectors feature above need to be positioned in sensible places. Then, tracks need to be defined connecting the different connection points, with dotted “rats nest” lines directly joining logically-connected points needing to be replaced with physical wiring in the form of those tracks. And of course, tracks do not enjoy the same luxury as the wires in the other views, of being able to cross over each other indiscriminately: they must be explicitly routed to the other side of the board, either using the existing connectors or by employing vias.

The lower surface of the PCB design in Fritzing

Hopefully, you will get to the point where there are no more dotted lines and where, upon selecting a connection point, all the appropriate points light up, just as we saw when probing the details of the other layouts. To reassure myself that I probably had connected everything up correctly, I went through my table and inspected the pin-outs of the components and did a kind of virtual electrical test, just to make sure that I wasn’t completely fooling myself.

With all this done, there isn’t much more to do before building up enough courage to actually get a board made, but one important step that remains is to run the “design checks” via the menu to see if there is anything that would prevent the board from working correctly or from otherwise being made. It can be the case that tracks do cross – the maze of yellow and orange can be distracting – or that they are too close and might cause signals to go astray. Fortunately, the hours of planning paid off here and only minor adjustments needed to be done.

It should be noted that the exercise of routing the tracks is certainly not to be underestimated when there are as many connections as there are above. Although an auto-routing function is provided, it failed to suggest tracks for most of the required connections and produced some bizarre routing as well. But clinging onto the memory of a working circuit in real three-dimensional space, along with the hope that two sides of a circuit board are enough and that there is enough space on the board, can keep the dream of a working design alive!

The Components

I skipped over the matter of components earlier on, and I don’t really want to dwell on it too much now, either. But one challenge that surprised me, given the selection of fancy components that can be dragged onto the canvas, was the lack of a simple template for a 32-pin DIP (dual in-line package) socket to hold the Am29F010 chip. There were socket definitions of different sizes, but it wasn’t possible to adjust the number of pins.

Now, there is a parts editor in Fritzing, but I tend to run away from graphical interfaces where I suspect that the matter could be resolved in more efficient ways, and it seems like other people feel the same way. Alongside the logical definition of the component’s connectors, one also has to consider the physical characteristics such as where the connectors are and what special markings will be reproduced on the PCB’s silk-screen for the component.

After copying an existing component, ransacking the Fritzing settings files, and editing various files, including those telling Fritzing about my new parts, I achieved my modest goals. But I would regard this as perhaps the weakest part of the software. I didn’t resort to doing things the behind-the-scenes way immediately, but the copy-and-edit paradigm was incredibly frustrating and doesn’t seem to be documented in a way I could usefully follow. There is a SparkFun tutorial which describes things at length, but one cannot help feeling that a lot of this should be easier, especially for very simple component changes like the one I needed.

The Result

With some confidence and only modest expectations of success, I elected to place an order with the Fritzing Fab service to see what the result would be like. This was straightforward for the most part: upload the file created by Fritzing, fill out some details (albeit not via a secure connection), and then proceed to payment. Unfortunately, the easy payment method involves PayPal, and PayPal now wants random people like myself to create an account with them before they will consider letting me make a credit card payment, which is something that was not previously required. Fortunately, the Fritzing people are most accommodating and do support wire transfers as an alternative payment method, and they were very responsive to my queries, so I managed to get an order submitted even more quickly than I had expected (considering that fabrication happens only once a week).

Just over a week after placing my order, the board was shipped from Germany, arriving a couple of days later here in Norway. Here is what it looked like:

The finished PCB from Fritzing

Now, all I had to do was to populate the board and to test the circuit again with the Arduino. First, I tested the connections using the Arduino’s 5V and GND pins with an LED in series with a resistor in an “old school” approach to the problem, and everything seemed to be as I had defined it in the Fritzing software.

Given that I don’t really like soldering things, the act of populating the board went about as well as expected, even though I could still clean up the residue from the solder a bit (which would lead me onto a story about buying the recommended chemicals that I won’t bother you with). Here is the result of that activity:

The populated board together with the Arduino

And all that remained was the task of getting my software running and testing the circuit in its new form. Originally, I had only been using 16 of the Am29F010’s 17 address pins, holding the seventeenth low, and I had to change the software to handle these extended addresses. In addition, the issuing of commands to the flash memory device needed a bit of refinement as well. Consequently, this testing went on for a bit longer than I would have wished, but eventually I managed to successfully replicate the programming of a ROM image that had been done some time ago with the breadboard circuit.
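
To give an impression of what that software has to do, here is a minimal Arduino-style sketch of how a 17-bit address might be latched through the two flip-flops. The clock pins follow the connection table given earlier, but the data bus pins, the A16 pin and the routine names are assumptions made for the purpose of illustration, not a transcription of my actual programmer code.

    // Illustrative only: latching a 17-bit address via the 74HC273s.
    // Pins 2 and 3 clock the flip-flops, as in the connection table;
    // the data bus and A16 assignments are assumed for this sketch
    // (the real board's assignments differ).
    const int FF1_CLOCK = 2;   // latches the low address byte (74HC273 #1)
    const int FF2_CLOCK = 3;   // latches the high address byte (74HC273 #2)
    const int A16_PIN = 12;    // hypothetical pin for the seventeenth bit
    const int DATA_PINS[8] = {4, 5, 6, 7, 8, 9, 10, 11};  // assumed bus pins

    void writeByteToBus(byte value) {
        for (int i = 0; i < 8; i++)
            digitalWrite(DATA_PINS[i], (value >> i) & 1);
    }

    void pulse(int pin) {
        // The 74HC273 captures its inputs on the rising edge of CP.
        digitalWrite(pin, HIGH);
        digitalWrite(pin, LOW);
    }

    void setAddress(unsigned long address) {
        writeByteToBus(address & 0xff);         // low byte into flip-flop #1
        pulse(FF1_CLOCK);
        writeByteToBus((address >> 8) & 0xff);  // high byte into flip-flop #2
        pulse(FF2_CLOCK);
        digitalWrite(A16_PIN, (address >> 16) & 1);  // the extended bit
    }

    void setup() {
        pinMode(FF1_CLOCK, OUTPUT);
        pinMode(FF2_CLOCK, OUTPUT);
        pinMode(A16_PIN, OUTPUT);
        for (int i = 0; i < 8; i++) pinMode(DATA_PINS[i], OUTPUT);
    }

    void loop() {
        setAddress(0x10000UL);  // example: select the upper 64K of the device
    }

With the address latched in this fashion, programming a byte then involves presenting the data on the bus and pulsing WE# with CE# asserted, preceded by the unlock and command bytes described in the Am29F010 datasheet.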

The outcome did rely on a certain degree of good fortune: the template for the Arduino Uno is not quite compatible with the Duemilanove, but this was rectified by clipping two superfluous pins from one of the headers I soldered onto the board; two of the connections belonging to the socket holding the flash memory chip touch the outside of the plastic “power jack” socket, but not enough to cause a real problem. But I would like to think that a lot of preparation dealt with problems that otherwise might have occurred.

Apart from liberating my breadboards and wires, this exercise has provided useful experience with PCB design. And of course, you can find the sources for all of this in my repository, as well as a project page for the board on the Fritzing projects site. I hope that this account of my experiences will encourage others to consider trying it out, too. It isn’t as scary as it might first appear, after all, although I won’t deny that it was quite a bit of work!

New Fairphone, New Features, Same Old Software Story?

Saturday, August 15th, 2015

I must admit that I haven’t been following Fairphone of late, so it was a surprise to see that vague details of the second Fairphone device have been published on the Fairphone Web site. One aspect that seems to be a substantial improvement is that of hardware modularity. Since the popularisation of the notion that such a device could be built by combining functional units as if they were simple building blocks, with a lot of concepts, renderings and position statements coming from a couple of advocacy initiatives, not much else has actually happened in terms of getting devices out for people to use and develop further. And there are people with experience of designing such end-user products who are sceptical about the robustness and economics of such open-ended modular solutions. To see illustrations of a solution that will presumably be manufactured takes the idea some way along the road to validation.

If it is possible to, say, switch out the general-purpose computing unit of the Fairphone with another one, then even if the Fairphone initiative fails once again to deliver a software solution that is entirely Free Software – perhaps because the choice of hardware obliges the initiative to deliver opaque “binary-only” payloads – the opportunity might be there for others to deliver a bottom-to-top free-and-open solution as a replacement component. But one might hope that it should not be necessary to “opt in” to getting a system whose sources can be obtained, rebuilt and redeployed: that the second Fairphone device might have such desirable characteristics out of the box.

Now, it does seem that Fairphone acknowledges the existence and the merits of Free Software, at least in very broad terms. Reading the support site provides us with an insight into the current situation with regard to software freedom and Fairphone:

Our goal is to take a more open source approach to be able to offer owners more choice and control over their phone’s OS. For example, we want to make the source code available to the developer community and we are also in discussions with other OS vendors to look at the possibility of offering alternative operating systems for the Fairphone 2. However, at the moment there are parts of the software that are owned or licensed by third parties, so we are still investigating the technical and legal requirements to accomplish our goals of open software.

First of all, ignoring vague terms like “open software” that are susceptible to “openwashing” (putting the label “open” on something that really isn’t), it should be noted that various parts of the deployed software will, through their licensing, oblige the Fairphone initiative to make the corresponding source code available. This is not a matter that can be waved away with excuses about people’s hands being tied, about coordination being difficult, or whatever else the average GPL-violating vendor might offer. If copyleft-licensed code ships, the sources must follow.

Now, there may also be proprietary software on the device (or permissively-licensed software bearing no obligation for anyone to release the corresponding source, which virtually amounts to the same thing). That would clearly work against software freedom, and it is something Fairphone should strongly consider avoiding, because neither end-users nor anyone who may wish to help those users would have any control over such software: they would be completely dependent on the vendor, who in turn would be completely dependent on their suppliers, who in turn might suddenly not care about the viability of that software or the devices on which it is deployed. So much for sustainability under such circumstances!

As I noted before, having control over the software is not a perk for those who wish to “geek out” over the internals of a product: it is a prerequisite for product viability, longevity and sustainability. Let us hope that Fairphone can not only learn and apply the lessons from their first device, which may indeed have occurred with the choice of a potentially supportable chipset this time around, but that the initiative can also understand and embrace their obligations to those who produced the bulk of their software (as well as to their customers) in a coherent and concrete fashion. It would be a shame if, once again, an unwillingness to focus on software led to another missed opportunity, and the need for another version of the device to be brought to market to remedy deficiencies in what is otherwise a well-considered enterprise.

Now, if only Fairphone could organise their Web site in a more coherent fashion, putting useful summaries of essential information in obvious places instead of burying them in some random forum post…

EOMA-68: The Return

Wednesday, April 8th, 2015

It is hard to believe that almost two years have passed since I criticised the Ubuntu Edge crowd-funding campaign for being a distraction from true open hardware initiatives (a campaign which also failed to reach its funding target, but which was presumably good advertising for Ubuntu’s mobile efforts for a short while). Since then, the custodians of Ubuntu have pressed on with their publicity stunts, the most recent of which involved limited initial availability of an Ubuntu-branded smartphone that may very well have been shipping without the corresponding source code for its GPL-licensed software being available, even though it is now claimed that this issue has been remedied. Given the problems with the same chipset vendor in other products, I personally cannot help feeling that the matter might need more investigation, but then again, I do not have time to chase up licence compliance in other people’s products, either.

Meanwhile, some genuine open hardware initiatives were mentioned in that critique of Ubuntu’s mobile strategy: GTA04 is the continuing effort to produce a smartphone that continues the legacy of the Openmoko Neo FreeRunner, whose experiences are now helping to produce the Neo900 evolution of the Nokia N900 smartphone; Novena is an open hardware laptop that was eventually successfully crowd-funded and is in the process of shipping to backers; OpenPandora is a handheld games console, the experiences from which have since been harnessed to initiate the DragonBox Pyra product with a very similar physical profile and target audience. There is a degree of collaboration and continuity within some of these projects, too: the instigator of the GTA04 project is assisting with the Neo900 and the Pyra, for example, partly because these projects use largely the same hardware platform. And, of course, GNU/Linux is the foundation of the software for all this hardware.

But in general, open hardware projects remain fairly isolated entities, perhaps only clustering into groups around particular chipsets or hardware platforms. And when it comes to developing a physical device, the amount of re-use and sharing between projects is perhaps less than we might have come to expect from software, particularly Free Software. Not that this has necessarily slowed the deluge of boards, devices, products and crowd-funding campaigns: everywhere you look, there’s a new Arduino variant or something claiming to be the next big thing in the realm of the “Internet of Things” (IoT), but after a while one gets the impression that it is the same thing being funded and sold, over and over again, with the audience probably not realising that it has all mostly been done before.

The Case for Modularity

Against this backdrop, there is one interesting and somewhat unusual initiative that I have only briefly mentioned before: the development of the EOMA-68 (Embedded Open Modular Architecture 68) standard along with products to demonstrate it. Unlike the average single-board computer or system-on-module board, EOMA-68 attempts to define a widely-usable modular computing unit which is also a complete computing device, delegating input (keyboards, mice, storage) and output (displays) to other devices. It has often been repeated that phones today are just general-purpose computers that happen to be able to make calls, and the same can be said for a lot of consumer electronics equipment that traditionally was either much simpler or employed only special-purpose computing units to perform its work: televisions are a reasonably illustrative example of this.

And of course, computers as we know them come in all shapes and sizes now: phones, media players, handhelds, tablets, netbooks, laptops, desktops, workstations, and so on. But most of these devices are not built to be upgraded when the core computing part of them becomes obsolete or, at the very least, less attractive than the computing features of newer devices, nor can the purchaser mix and match the computing part of one device with the potentially more attractive parts of another: one kind of smart television may have a much better screen but a poorer user interface that one would want to replace, for example. There are workarounds – some people use USB-based “plug computers” to give their televisions “smart” capabilities – but when you buy a device, you typically have to settle for the bundled software and computing hardware (even if the software might eventually be modifiable thanks to the role of the GPL, subject to constraints imposed by manufacturers that might prevent modification).

With a modular computing unit, the element of choice is obviously enhanced, but it also helps those developing open hardware. First of all, the interface to the computing unit is well-defined, meaning that the designers of a device need not be overly concerned with the way the general-purpose computing functionality is to be provided beyond the physical demands of that particular module and the facilities provided by it. Beyond such constraints, being able to rely on a tested functional element, designers can focus on the elements of their device that differentiate it from other devices without having to master the integration of their own components of interest with those required for the computing functionality in one “make or break” hardware design that might prove too demanding to get right first time (or even second or third time). Prototyping complicated circuit designs can quickly incur considerable costs, and eliminating complexity from what might be described as the “peripheral board” – the part providing the input and output capabilities and the character of a particular device – not only reduces the risk of getting things wrong, but it could make the production of that board cheaper, too. And that might open up device design to a broader group of participants.

As Nico Rikken explains, EOMA-68 promises to offer benefits for hardware designers, software developers and customers. Modularity does make sense if properly considered, which is perhaps why other modularity initiatives like Phonebloks have plenty of critics even though they share the same worthy objectives of reducing waste and avoiding device obsolescence: with vague statements about modularity and the hint of everything being interchangeable and interoperating with everything else, one cannot help being sceptical about the potential complexity and interoperability problems that could result, not to mention the ergonomic issues that most people can easily relate to. By focusing on the general-purpose computing aspect of modularity, EOMA-68 addresses the most important part of the hardware for Free Software and delegates modularity elsewhere in the system to other initiatives that do not claim to do it all.

A Few False Starts

Unfortunately, not everything has gone precisely according to schedule with EOMA-68 so far. After originally surfacing as part of an initiative to make a competitive ARM-based netbook, the plan was to make computing modules and “engineering boards” on the way to delivering a complete product, and the progress of the first module can be followed on the Allwinner A10 news page on the Rhombus Tech Web site. From initial interest from various parties at the start of 2012, and through a considerable amount of activity, by May 2013, working A10 boards were demonstrated running Debian Wheezy. And a follow-up board employing the Allwinner A20 instead of the A10 was demonstrated running Debian at the end of October 2014 as part of a micro-desktop solution.

One might have thought that these devices would be more widely available by now, particularly as development began in 2012 on a tablet board to complement the computing modules, with apparently steady progress being made. Now, the development of this tablet was driven by the opportunity to collaborate with the Vivaldi tablet project, whose own product had been rendered unusable for Free Software usage by the usual product iteration performed behind the scenes by the contract manufacturer changing the components in use without notice (as is often experienced by those buying computers to run Free Software operating systems, only to discover that the wireless chipset, say, is no longer one that is supported by Free Software). With this increased collaboration with KDE-driven hardware initiatives (Improv and Vivaldi), efforts seemingly became directed towards satisfying potential customers within the framework of those other initiatives, so that to acquire the micro-engineering board one would seek to purchase an Improv board instead, and to obtain a complete tablet product one would place an advance order for the Vivaldi tablet instead of anything previously under development.

Somehow during 2014, the collaboration between the participants in this broader initiative appears to have broken down, with there undoubtedly being different perspectives on the sequence of events that led to the cancellation of Improv and Vivaldi. Trawling the mailing list archives gives more detail but not much more clarity, and it can perhaps only be said that mistakes may have been made and that everybody learned new things about certain aspects of doing business with other people. The effect, especially in light of the deluge of new and shiny products for casual observers to purchase instead of engaging in this community, and with many people presumably being told that their Vivaldi tablet would not be shipping after all, probably meant that many people lost interest and, indeed, hope that there would be anything worth holding out for.

The Show Goes On

One might have thought that such a setback would have brought about the end of the initiative, but its instigator shows no sign of quitting, probably because genuine hardware has been made, and other opportunities and collaborations have been created on the way. Initially, the focus was on an ARM-based netbook or tablet that would run Free Software without the vendor neglecting to provide the complete corresponding source for things like the Linux kernel and bootloader required to operate the device. This requirement for licence compliance has not disappeared or diminished, with continuing scrutiny placed on vendors to make sure that they are not just throwing binaries over the wall.

But as experience was gained in evaluating suitable CPUs, it was not only ARM CPUs that were found to have the necessary support characteristics for software freedom as well as for low power consumption. The Ingenic jz4775, a sibling of the rather less capable jz4720 used by the Ben NanoNote, uses the MIPS architecture and may well be fully supported by the mainline Linux kernel in the near future; the ICubeCorp IC1T is a more exotic CPU that can be supported by Free Software toolchains and should be able to run the Linux kernel in addition to Android. Alongside these, the A20 remains the most suitable of the options from Allwinner, whose products have always been competitively priced (which has also been a consideration), but there are other ARM derivatives that would be more interesting from a vendor cooperation perspective, notably the TI AM389x series of CPUs.

Meanwhile, after years of questions about whether a crowd-funding campaign would be started to attract customers and to get the different pieces of hardware built in quantity, plans for such a campaign are now underway. While initial calls for a campaign may have been premature, I now think that the time is right: people have been using the hardware already produced for some time, and considerable experience has been amassed along the way up to this point; the risks should be substantially lower than quite a few other crowd-funding campaigns that seem to be approved and funded these days. Not that anyone should seek to conceal the nature of crowd-funding and the in-built element of risk associated with such campaigns, of course: it is not the same as buying a product from a store.

Nevertheless, I would be very interested to see this hardware being made, and I am even on record as having said so. Part of this is selfishness: I could do with some newer, quieter, less power-consuming hardware. But I also think that a choice of different computing modules, supporting Free Software operating systems out of the box, with some of them candidates for FSF endorsement, and offering a diversity of architectures, would be beneficial to a sustainable computing infrastructure in the longer term. If you also think so, maybe you should follow the progress of EOMA-68 over the coming weeks and months, too.

Open Hardware and Free Software: Not Just For The Geeks

Saturday, April 4th, 2015

Having seen my previous article about the Fairphone initiative’s unfortunate choice of technologies mentioned in various discussions about the Fairphone, I feel a certain responsibility to follow up on some of the topics and views that tend to get aired in these discussions. In response to an article about an “open operating system” for the Fairphone, a rather solid comment was made about how the initiative still seems to be approaching the problem from the wrong angle.

Because the article comments have been delegated to a proprietary service that may at some point “garbage-collect” them from the public record, I reproduce the comment here (and I also expanded the link previously provided by a link-shortening service for similar and other reasons):

You are having it all upside down.
Just make your platform open instead of using proprietary chipsets with binary blobs! Then porting Firefox OS to the Fairphone would be easy as pie.

Not listening to the people who said that only free software running on open hardware would be really fair is exactly what brought you this mess: Our approach to software and ongoing support for the first Fairphones
It is also why I advised all of my friends and acquaintances not to order a Fairphone until it becomes a platform that respects user freedom. Turns out I was more than right.
If the Fairphone was an open platform that could run Firefox OS, Replicant or pure Debian, I would tell everybody in need of a cellphone to buy one.

I don’t know the person who wrote this comment, but it is very well-formulated, and one wouldn’t think that there would be much to add. Unfortunately, some people seem to carry around their own misconceptions about some of the concepts mentioned above, and unfortunately, they are quite happy to propagate those misconceptions as if they were indisputable facts. Below, I state the real facts in the headings and quote each one of the somewhat less truthful misconceptions for further scrutiny.

Open Hardware and Free Software is for Everyone

Fairphone should not make the mistake of producing a phone for geeks. Instead, it should become a phone for everyone.

Just because people have an opinion about technology and wish to see certain guarantees made about the nature of that technology does not mean that the result is “for geeks”. In fact, making the hardware open means that more people can figure things out about it, improve it, understand it, and improve the way it works and the software that uses it. Making the software truly open means that more people can change it, fix it, enhance it, and extend the usable life of the device. All of this benefits everyone, whereas closed hardware and proprietary software ultimately benefit only the small groups of people who respectively designed the device and wrote the software, both of whom are very likely to lose interest in sustaining the life of that product as soon as they have another one they want to sell you. (And often, in the case of the hardware, as soon as it leaves the factory.)

User Freedom Means Exactly User Freedom

‘User freedom’ is often used when actually ‘developers freedom’ is meant. It is more of an ideology.

Incorrect! Those of us who use the term Free Software know exactly what we mean: it is the freedom of the end-user to exercise precisely those privileges that have resulted in the work being produced and delivered to them. Now, there are people who advocate “permissive licences” that favour developers, in that they allow people to use the work of others and then provide a piece of software under conditions that grant the end-user only limited privileges, taking away the privileges to see how the entire work is constructed, along with those that allow the entire work to be improved and shared. Whether one sees either of these as an ideology – presumably emphasising one’s own “pragmatism” in contrast – is largely irrelevant, because the genuine pragmatism involved in Free Software and the propagation of a broader set of privileges actually delivers sustainability: users – genuine end-users, not middle-men – get the freedom to participate in how the product turns out and, crucially, in how it lives on after the original producer has decided to go off and do something else.

Openness Does Not Preclude Fanciness (But Security Requires Openness)

What people want is: user friendly interface, security/privacy, good specs and ability to install apps and games. [...] OpenSource is a nice idea, but has its disadvantages too: who is caring about quality?

It’s just too easy for people to believe claims about privacy and security, even after everybody found out that they were targets of widespread surveillance, and even after various large corporations – who presumably care about their reputations – have either lost the personal details of their users to criminals or have shared those details with others (who also have criminal or unethical intent). When believing the sales-pitch about total privacy and robust security, those people will happily reassure themselves and others that no company would allow its reputation to be damaged by any breach of privacy or security! But there are no guarantees of security or privacy if you cannot trust the systems you use, and there is no way of trusting them without being able to inspect how they work. More than ever, people need genuine guarantees of security and privacy – not reassurances from salesmen and advertisers – and the best way to start off on the path towards such guarantees is to be able to deploy Free Software on a device that you fully control.

And as for quality, user-friendliness and all the desirable stuff: how many people use products like Firefox in its various forms every single day? Such Free Software solutions have not merely set the standard over the years, but they have kept technologies like the Web relevant and viable, in stark contrast to proprietary bundled programs like Internet Explorer that have actually impaired technological and social progress, with “IE” doing its bit by exhibiting a poor record of adherence to standards and a continuous parade of functionality and security bugs, not to mention constant usability frustrations endured by its unfortunate (and frequently involuntary) audience of users.

Your Priorities Make Free Software Important

I found the following comment to be instructive:

For me open source isn’t important. My priorities are longevity/updates, support, safety/privacy.

The problem is this: how can you guarantee longevity, updates, support, safety and privacy without openness? Safety and privacy would require you to have blind trust in someone whose claims you cannot verify. Longevity, updates and support require you to rely on the original producer’s continued interest in the product that you have just purchased from them, and should it become more profitable for them to focus on other products (that they might want you to buy instead of continuing to use the one you have), you might be able to rely on the goodwill of that producer to transfer their responsibilities to others to do the thankless tasks of maintenance and support. But it may well be the case that no amount of money will be able to keep that product viable for you: the producer may simply refuse to support it or to let others support it. Perhaps some people may step in and reverse-engineer the product and make an effort to keep it viable, but wouldn’t it be better to have an open product to start with, where people can choose how it is maintained – and thus sustained – for as long as people still want to use it?

Concepts like open hardware and Free Software sound like topics for the particularly-interested, but they provide the foundations for those topics of increasing interest and attention that people claim to care so much about. Everybody deserves things like choice, democracy, privacy, security, safety, control over their own lives and destinies, and so on. Closed hardware and proprietary software may be used on lots of devices, and people may be getting a lot of use out of those devices, but the users of those devices enjoy the benefits only as long as it remains in the interests of the producers of those devices and the accompanying software to allow them to do so. Furthermore, few or none of those users can be sure whether any of those important things – their rights – are being impaired by their use of those devices. Are their communications being intercepted, collected, analysed? Few people would ever know.

Free Software and open hardware empower their users with the control that proprietary technologies deny their users. But shouldn’t everybody be able to benefit from such control? That’s why a device that is open hardware and which runs Free Software really is for everyone, not just for “geeks”.

The BBC Micro and the BBC Micro Bit

Sunday, March 22nd, 2015

At least on certain parts of the Internet, as well as in other channels, there has been a degree of excitement about the announcement by the BBC of a computing device called the “Micro Bit”, with the BBC’s plan to give one of these devices to each child starting secondary school, presumably in September 2015, attracting particular attention amongst technology observers and television licence fee-payers alike. Details of the device are a little vague at the moment, but the announcement, along with discussions of the role of the corporation and previous initiatives of this nature, provides me with an opportunity to look back at the original BBC Microcomputer, evaluate some of the criticisms (and myths) around the associated Computer Literacy Project, and consider the things that were done right and wrong, with the latter hopefully not about to be repeated in this latest endeavour.

As the public record reveals, at the start of the 1980s, the BBC wanted to engage its audience beyond television programmes describing the growing microcomputer revolution, and it was decided that to do this and to increase computer literacy generally, it would need to be able to demonstrate various concepts and technologies on a platform that would be able to support the range of activities to which computers were being put to use. Naturally, a demanding specification was constructed – clearly, the scope of microcomputing was increasing rapidly, and there was a lot to demonstrate – and various manufacturers were invited to deliver products that could be used as this reference platform. History indicates that a certain amount of acrimony followed – a complete description of which could fill an entire article of its own – but ultimately only Acorn Computers managed to deliver a machine that could do what the corporation was asking for.

An Ambitious Specification

It is worth considering what the BBC Micro was offering in 1981, especially when considering ill-informed criticism of the machine’s specifications by people who either prefer other systems or who felt that participating in the development of such a machine was none of the corporation’s business. The technologies to be showcased by the BBC’s programme-makers and supported by the materials and software developed for the machine included full-colour graphics, multi-channel sound, 80-column text, Viewdata/Teletext, cassette and diskette storage, local area networking, interfacing to printers, joysticks and other input/output devices, as well as to things like robots and user-developed devices. Although it is easy to pick out one or two of these requirements, move forward a year or two, increase the budget two- or three-fold, or any combination of these things, and then nominate various other computers, there really were few existing systems that could deliver all of the above, at least at an affordable price at the time.

Some microcomputers of the early 1980s
Apple II Plus – RAM: up to 64K; text: 40 x 25 (upper case only); graphics: 280 x 192 (6 colours), 40 x 48 (16 colours); year: 1979; price: £1500 or more
Commodore PET 4032/8032 – RAM: 32K; text: 40/80 x 25; graphics: graphics characters (2 colours); year: 1980; price: £800 (4032), £1030 (8032), including monochrome monitor
Commodore VIC-20 – RAM: 5K; text: 22 x 23; graphics: 176 x 184 (8 colours); year: 1980 (1981 outside Japan); price: £199
IBM PC (Model 5150) – RAM: 16K up to 256K; text: 40/80 x 25; graphics: 640 x 200 (2 colours), 320 x 200 (4 colours); year: 1981; price: £1736 (including monochrome monitor, presumably with 16K or 64K)
BBC Micro (Model B) – RAM: 32K; text: 80/40/20 x 32/24, Teletext; graphics: 640 x 256 (2 colours), 320 x 256 (2/4 colours), 160 x 256 (4/8 colours); year: 1981; price: £399 (originally £335)
Research Machines LINK 480Z – RAM: 64K (expandable to 256K); text: 40 x 24 (optional 80 x 24); graphics: 160 x 72, 80 x 72 (2 colours), expandable to 640 x 192 (2 colours), 320 x 192 (4 colours), 190 x 96 (8 colours or 16 shades); year: 1981; price: £818
ZX Spectrum – RAM: 16K or 48K; text: 32 x 24; graphics: 256 x 192 (16 colours applied using attributes); year: 1982; price: £125 (16K), £175 (48K)
Commodore 64 – RAM: 64K; text: 40 x 25; graphics: 320 x 200 (16 colours applied using attributes); year: 1982; price: £399

Perhaps the closest competitor, already being used in a fairly limited fashion in educational establishments in the UK, was the Commodore PET. However, it is clear that despite the adaptability of that system, its display capabilities were becoming increasingly uncompetitive, and Commodore had chosen to focus on the chipsets that would power the VIC-20 and Commodore 64 instead. (The designer of the PET went on to make the very capable, and understandably more expensive, Victor 9000/Sirius 1.) Apple products were notoriously expensive and, indeed, the target of Commodore’s aggressive advertising, but this did not prevent them from capturing the US education market from the PET; in the UK, however, they always remained severely uncompetitive, as commentators of the time noted.

Later, the ZX Spectrum and Commodore 64 were released. Technology was progressing rapidly, and in hindsight one might have advocated waiting around until more capable and cheaper products came to market. However, considering the need to fulfil virtually all aspects of the ambitious specification at an acceptable price, it would arguably not be until the release of the Amstrad CPC series in 1984 that a suitable alternative product became available. Even then, these Amstrad computers actually benefited from the experience accumulated in the UK computing industry following the introduction of the BBC Micro: they were, if anything, an iteration within the same generation of microcomputers and would even have used the same 6502 CPU as the BBC Micro had it not been for time-to-market pressures and the readily-available expertise with the Zilog Z80 CPU amongst those in the development team. And yet, specific aspects of the specification would still have gone unfulfilled: the BBC Micro had hardware support for Teletext displays, whereas the Amstrad machines would have had to emulate these using a bitmapped display and suitable software.

Arise Sir Clive

Much has been made of the disappointment of Sir Clive Sinclair that his computers were not adopted by the BBC as products to be endorsed and targeted at schools. Sinclair made his name developing products that were competitive on price, often seeking cost-reduction measures to reach attractive pricing levels, but such measures also served to make his products less desirable. If one reads reviews of microcomputers from the early 1980s, many reviewers explicitly mention the quality of the keyboard provided by the computers being reviewed: a “typewriter” keyboard with keys that “travel” appears to have been much preferred over the “calculator” keyboards provided by computers like the ZX Spectrum, Oric 1 or Newbury NewBrain, and vastly preferred over the “membrane” keyboards employed by the ZX80, ZX81 and Atari 400.

For target audiences in education, business, and in the home, it would have been inconceivable to promote a product with anything less than a “proper” keyboard. Ultimately, the world had to wait until the ZX Spectrum +2 released in 1986 for a Spectrum with such a keyboard, and that occurred only after the product line had been acquired by Amstrad. (One might also consider the ZX Spectrum+ in 1984, but its keyboard was more of a hybrid of the calculator keyboard that had been used before and the “full-travel” keyboards provided by its competitors.)

Some people claim that they owe nothing to the BBC Micro and everything to the ZX Spectrum (or, indeed, the computer they happened to own) for their careers in computing. Certainly, the BBC Micro was an expensive purchase for many people, although contrary to popular assertion it was not any more expensive than the Commodore 64 upon that computer’s introduction in the UK, and for those of us who wanted BBC compatibility at home on a more reasonable budget, the Acorn Electron was really the only other choice. But it would be as childish as the playground tribalism that had everyone insist that their computer was “the best” to insist that the BBC Micro had no influence on computer literacy in general, or on the expectations of what computer systems should provide. Many people who owned a ZX Spectrum freely admit that the BBC Micro coloured their experiences, some even subsequently seeking to buy one or one of its successors and to go on to build a successful software development career.

The Costly IBM PC

Some commentators seem to consider the BBC Micro as having been an unnecessary diversion from the widespread adoption of the IBM PC throughout British society. As was the case everywhere else, the de-facto “industry standard” of the PC architecture and DOS captured much of the business market and gradually invaded the education sector from the top down, although significantly better products existed both before and after its introduction. It is tempting with hindsight to believe that by placing an early bet on the success of the then-new PC architecture, business and education could have somehow benefited from standardising on the PC and DOS applications. And there has always been the persistent misguided belief amongst some people that schools should be training their pupils/students for a narrow version of “the world of work”, as opposed to educating them to be able to deal with all aspects of their lives once their school days are over.

What many people forget or fail to realise is that the early 1980s witnessed rapid technological improvement in microcomputing, that there were many different systems and platforms, some already successful and established (such as CP/M), and others arriving to disrupt ideas of what computing should be like (the Xerox Alto and Star having paved the way for the Apple Lisa and Macintosh, the Atari ST, and so on). It was not clear that the IBM PC would be successful at all: IBM had largely avoided embracing personal computing, and although the system was favourably reviewed and seen as having the potential for success, thanks to IBM’s extensive sales organisation, other giants of the mainframe and minicomputing era such as DEC and HP were pursuing their own personal computing strategies. Moreover, existing personal computers were becoming entrenched in certain markets, and early adopters were building a familiarity with those existing machines that was reflected in publications and materials available at the time.

Despite the technical advantages of the IBM PC over much of the competition at the beginning of the 1980s, it was also substantially more expensive than the mass-market products arriving in significant numbers, aimed at homes, schools and small businesses. With many people remaining intrigued but unconvinced by the need for a personal computer, it would have been impossible for a school to justify spending almost £2000 (probably around £8000 today) on something without proven educational value. Software would also need to be purchased, and the procurement of expensive and potentially non-localised products would have created even more controversy.

Ultimately, the Computer Literacy Project stimulated the production of a wide range of locally-produced products at relatively inexpensive prices, and while there may have been a few years of children learning BBC BASIC instead of one of the variants of BASIC for the IBM PC (before BASIC became a deprecated aspect of DOS-based computing), it is hard to argue that those children missed out on any valuable experience using DOS commands or specific DOS-based products, especially since DOS became a largely forgotten environment itself as technological progress introduced new paradigms and products, making “hard-wired”, product-specific experience obsolete.

The Good and the Bad

Not everything about the BBC Micro and its introduction can be considered unconditionally good. Choices needed to be made to deliver a product that could fulfil the desired specification within certain technological constraints. Some people like to criticise BBC BASIC as being “non-standard”, for example, which neglects the diversity of BASIC dialects that existed at the dawn of the 1980s. Typically, for such people “standard” equates to “Microsoft”, but back then Microsoft BASIC was a number of different things. Commodore famously paid a one-off licence fee to use Microsoft BASIC in its products, but the version for the Commodore 64 was regarded as lacking user-friendly support for graphics primitives and other interesting hardware features. Meanwhile, the MSX range of microcomputers featured Microsoft Extended BASIC which did provide convenient access to hardware features, although the MSX range of computers were not the success at the low end of the market that Microsoft had probably desired to complement its increasing influence at the higher end through the IBM PC. And it is informative in this regard to see just how many variants of Microsoft BASIC were produced, thanks to Microsoft’s widespread licensing of its software.

Nevertheless, the availability of one company’s products does not make a standard, particularly if interoperability between those products is limited. Neither BBC BASIC nor Microsoft BASIC can be regarded as anything other than de-facto standards in their own territories, and it is nonsensical to regard one as non-standard when the other has largely the same characteristics as a proprietary product in widespread use, even if it was licensed to others, as indeed both Microsoft BASIC and BBC BASIC were. Genuine attempts to standardise BASIC did exist, notably BASICODE, which was used in the distribution of programs via public radio broadcasts. One suspects that people making casual remarks about standard and non-standard things remain unaware of such initiatives. Meanwhile, Acorn did deliver implementations of other standards-driven programming languages such as COMAL, Pascal, Logo, Lisp and Forth, largely adhering to the applicable standards subject to the limitations of the hardware.

However, what undermined the BBC Micro and Acorn’s related initiatives over time was the control that they as a single vendor had over the platform and its technologies. At the time, a “winner takes all” mentality prevailed: Commodore under Jack Tramiel had declared a “price war” on other vendors and had caused difficulties for new and established manufacturers alike, with Atari eventually being sold to Tramiel (who had resigned from Commodore) by Warner Communications, but many companies disappeared or were absorbed by others before half of the decade had passed. Indeed, Acorn, who had released the Electron to compete with Sinclair Research at the lower end of the market, and who had been developing product lines to compete in the business sector, experienced financial difficulties and was ultimately taken over by Olivetti; Sinclair, meanwhile, experienced similar difficulties and was acquired by Amstrad. In such a climate, ideas of collaboration seemed far from everybody’s minds.

Since then, the protagonists of the era have been able to reflect on such matters, Acorn co-founder Hermann Hauser admitting that it may have been better to license Acorn’s Econet local area networking technology to interested competitors like Commodore. Although the sentiments might have something to do with revenues and influence – it was at Acorn that the ARM processor was developed, sowing the seeds of a successful licensing business today – the rest of us may well ask what might have happened had the market’s participants of the era cooperated on things like standards and interoperability, helping their customers to protect their investments in technology, and building a bigger “common” market for third-party products. What if they had competed on bringing technological improvements to market without demanding that people abandon their existing purchases (and cause confusion amongst their users) just because those people happened to already be using products from a different vendor? It is interesting to see the range of available BBC BASIC implementations and to consider a world where such building blocks could have been adopted by different manufacturers, providing a common, heterogeneous platform built on cooperation and standards, not the imposition of a single hardware or software platform.

But That Was Then

Back then, as Richard Stallman points out, proprietary software was the norm. It would have been even more interesting had the operating systems and the available software for microcomputers been Free Software, but that may have been asking too much at the time. And although computer designs were often shared and published, a tendency to prevent copying of commercial computer designs prevailed, with Acorn and Sinclair both employing proprietary integrated circuits mostly to reduce complexity and increase performance, but partly to obfuscate their hardware designs, too. Thus, it may have been too much to expect something like the BBC Micro to have been open hardware to any degree “back in the day”, although circuit diagrams were published in publicly-available technical documentation.

But we have different expectations now. We expect software to be freely available for inspection, modification and redistribution, knowing that this empowers the end-users and reassures them that the software does what they want it to do, and that they remain in control of their computing environment. Increasingly, we also expect hardware to exhibit the same characteristics, perhaps only accepting that some components are particularly difficult to manufacture and that there are physical and economic restrictions on how readily we may practise the modification and redistribution of a particular device. Crucially, we demand control over the software and hardware we use, and we reject attempts to prevent us from exercising that control.

The big lesson to be learned from the early 1980s, to be considered now in the mid-2010s, is not how to avoid upsetting a popular (but ultimately doomed) participant in the computing industry, as some commentators might have everybody believe. It is to avoid developing proprietary solutions that favour specific organisations and that, despite the general benefits of increased access to technology, ultimately disempower the end-user. And in this era of readily available Free Software and open hardware platforms, the lesson to be learned is to strengthen such existing platforms and to work with them, letting those products and solutions participate and interoperate with the newly-introduced initiative in question.

The BBC Micro was a product of its time and its development was very necessary to fill an educational need. Contrary to the laziest of reports, the Micro Bit plays a different role as an accessory rather than as a complete home computer, at least if we may interpret the apparent intentions of its creators. But as a product of this era, our expectations for the Micro Bit are greater: we expect transparency and interoperability, the ability to make our own (just as one can with the Arduino, as long as one does not seek to call it an Arduino without asking permission from the trademark owner), and the ability to control exactly how it works. Whether there is a need to develop a completely new hardware solution remains an unanswered question, but we may assert that should it be necessary, such a solution should be made available as open hardware to the maximum possible extent. And of course, the software controlling it should be Free Software.

As we edge gradually closer to September and the big deployment, it will be interesting to assess how the device and the associated initiative measure up to our expectations. Let us hope that the right lessons from the days of the BBC Micro have indeed been learned!

The Unplanned Obsolescence of the First Fairphone Device

Friday, December 12th, 2014

About a year-and-a-half ago, I gave my impressions of the Fairphone, noting that the initiative was worthy in terms of its social and sustainability goals, but that it had neglected the “fairness” of the software to be provided with each device. Although the Fairphone organisation had made “root access” – or more correctly stated, “owner control” – of the device a priority and had decided to provide its user interface enhancements to Android as Free Software, it had chosen to use a set of hardware technologies with a poor record of support for Free Software.

It might be said that such an initiative cannot possibly hope to act in the most prudent manner in every respect. However, unlike expertise in minerals sourcing, complicated global supply chains, and proprietary manufacturing activities, expertise on matters of hardware support for Free Software is abundantly available to anyone who can be bothered to ask. Many people already struggle with poorly-supported hardware for which only binary firmware or driver releases are available from the manufacturer, often resulting in incorrectly-performing hardware with no chance of future fixes as the manufacturer discontinues support in order to focus on selling new products. Others struggle with continuing but inconvenient forms of support, offered on the manufacturer’s own timescale and terms.

Consequently, there are increasing numbers of people with experience of reverse engineering, documenting, and reimplementing firmware and drivers for proprietary hardware, many of whom would be only too happy to share their experiences with others wishing to avoid the pitfalls of being tied to a proprietary hardware vendor with a proprietary software mentality. There are also communities developing open hardware who seek out enlightened hardware vendors that encourage Free Software drivers for their products and may even support firmware that is itself Free Software on their devices. There are even people developing smartphones in the open whose experiences and opinions would surely be valuable to anyone needing advice on the more reliable, open and trustworthy hardware vendors.

One community that has remained active despite various setbacks is the one pursuing the development of the EOMA-68 modular computing platform. It was precisely this kind of “ODM versus chipset vendor versus Free Software community” circus that prompted the development of an open platform and attempts to reach out and cultivate constructive communications with various silicon vendors. Such vendors – notably Allwinner Technology in the case of EOMA-68, but also other companies that have previously been open to dialogue – have come to realise that Free Software is an asset, and that Free Software communities are their partners, not just a bunch of people whose work can be taken and used without paying attention to the terms under which that work was originally shared. Such dialogues are ongoing and are subject to setbacks as well as progress, but it is far better to cultivate good practices than to ignore bad practices and to dump the ugly result onto the end-user.

Now, the Fairphone organisation has started to reconsider the software issue in light of the real possibility that their device will not be upgradeable beyond an old release of Android:

“We are actively looking at ways to achieve this goal, but we’re trying to be realistic and face the fact that the first Fairphones will most likely not be upgraded beyond Android 4.2.”

Given that the viability of a device depends not only on the continued functioning of its hardware but also on the correct functioning of its software, and given that one motivation many people state for upgrading their phone is to gain access to a supported operating system distribution and/or one that supports the applications they need or desire, this unfortunate neglect of software sustainability has undermined the general sustainability of the device. It may very well be the case that the Fairphone organisation’s initiatives around re-use and recycling can mitigate the problems caused by any abandonment of these devices, as people seek out replacements that do what they demand of them. But one of the most potent goals – reducing consumption by providing a long-lasting device – has been undermined by the very thing that should be the easiest part of the product to change, maintain and upgrade, and that could even remedy shortcomings in the chosen physical components: something whose lifespan is dictated far less by physical constraints than that of the assembly of physical components making up the device itself.

It is quite possible that certain industry practices have remained unknown to the Fairphone organisation, despite bitter experiences elsewhere, and that they are only now catching up with what many other people have learned over the years:

“Our chipset vendor MediaTek is only publicly releasing what it is bound to by the obligatory terms of the GNU public license GPL (the Linux Kernel and a few userland programs) and has chosen not to release any of the Android source code.”

Once again, the GPL demonstrates its worth as a necessary tool to ensure that the end-user remains in control. Unfortunately, Google decided that the often shoddy practices of its hardware and industry partners should be indulged by allowing them to make proprietary products with Google’s permissively-licensed code. It could be worse: some hardware vendors violate the GPL and blame their suppliers, requiring anyone seeking recourse to traverse the supply chain as far as it goes, potentially to some obscure company in a faraway land whose management plead poverty while actually doing very nicely selling their services to anyone willing to pay; others just appear to brazenly violate the GPL and dare someone to sue them.

The Fairphone organisation could have valued the sustainability benefits of Free Software and cooperative hardware vendors. Had they done so – merely by asking for informed opinions – they could have avoided this mess entirely. Unfortunately, they may have focused too narrowly on certain worthy and necessary topics, maintaining an oversimplified view of software that, if mainstream media punditry is to be believed, is merely transient and interchangeable: something that can be made to run on any hardware as if by magic, with each upgrade replacing what was there before with something that is always better, only ever offering improvements and benefits. Those of us with more than a passing knowledge of systems development know that such beliefs are really delusions, produced by a lack of experience or by a wish to believe that unfamiliar things are easier than they actually are.

Since we cannot go back and change the way things were done before, I suppose that now is the time to deliver on the sustainability promise by fully and properly promoting and supporting Free Software on any future Fairphone device. This means that the Fairphone organisation has to start listening to people with experience of reliably deploying and supporting Free Software on open or properly-documented hardware, instead of going along with whatever some supplier (and their potentially GPL-violating associates) would have them do just to get the contract in the bag and the device out of the door.