Paul Boddie's Free Software-related blog


Archive for the ‘hardware’ Category

Hardware Experiments with Fritzing

Friday, August 28th, 2015

One of my other interests, if you can even regard it as truly separate to my interests in Free Software and open hardware, involves the microcomputer systems of the 1980s that first introduced me to computing and probably launched me in the direction of my current career. There are many aspects of such systems that invite re-evaluation of their capabilities and limitations, leading to the consideration of improvements that could have been made at the time, as well as more radical enhancements that unashamedly employ technology that has only become available or affordable in recent years. Such “what if?” thought experiments and their hypothetical consequences are useful if we are to learn from the strategic mistakes once made by systems vendors, to have an informed perspective on current initiatives, and to properly appreciate computing history.

At the same time, people still enjoy actually using such systems today, writing new software and providing hardware that makes such continuing usage practical and sustainable. These computers and their peripherals are certainly “getting on”, and acquiring or rediscovering such old systems does not necessarily mean that you can plug them in and they still work as if they were new. Indeed, the lifetime of magnetic media and the devices that can read it, together with issues of physical decay in some components, mean that alternative mechanisms for loading and storing software have become attractive for some users, having been developed to complement or replace the cassette tape and floppy disk methods that those of us old enough to remember would have used “back in the day”.

My microcomputer of choice in the 1980s was the Acorn Electron – a cut-down, less expensive version of the BBC Microcomputer hardware platform – which supported only cassette storage in its unexpanded form. However, some expansion units added the disk interfaces present on the BBC Micro, while others added the ability to use ROM-based software. On the BBC Micro, one would plug ROM chips directly into sockets, and some expansion units for the Electron supported this method, too. The official Plus 1 expansion chose instead to support the more friendly expansion cartridge approach familiar to users of other computing and console systems, with ROM cartridges being the delivery method for games, applications and utilities in this form, providing nothing much more than a ROM chip and some logic inside a convenient-to-use cartridge.

The Motivation

A while ago, my brother, David, became interested in delivering software on cartridge for the Electron, and a certain amount of discussion led him to investigate various flash memory integrated circuits (ICs, chips), notably the AMD Am29F010 series. As technological progress continues, such devices provide a lot of storage in comparison to the ROM chips originally used with the Electron: the latter having only 16 kilobytes of capacity, whereas the Am29F010 variant chosen here has a capacity of 128 kilobytes. Meanwhile, others chose to look at EEPROM chips, notably the AT28C256 from Atmel.

Despite the manufacturing differences, both device types behave in a very similar way: a good idea for the manufacturers who could then sell products that would be compatible straight away with existing products and the mechanisms they use. In short, some kind of de-facto standard seems to apply to programming these devices, and so it should be possible to get something working with one and then switch to the other, especially if one kind becomes too difficult to obtain.

Now, some people realised that they could plug such devices into their microcomputers and program them “in place” using a clever hack where writes to the addresses that correspond to the memory provided by the EEPROM (or, indeed, flash memory device) in the computer’s normal memory map can be trivially translated into addresses that have significance to the EEPROM itself. But not routinely using such microcomputers myself, and wanting more flexibility in the programming of such devices, not to mention also avoiding the issue of getting software onto such computers so that it can be written to such non-volatile memory, it seemed like a natural course of action to try to do the programming with the help of some more modern conveniences.

And so I considered the idea of getting a microcontroller solution like the Arduino to do the programming work. Since an Arduino can be accessed over USB, a ROM image could be conveniently transferred from a modern computer and, with a suitable circuit wired up, programmed into the memory chip. ROM images can thus be obtained in the usual modern way – say, from the Internet – and then written straight to the memory chip via the Arduino, rather than having to be written first to some other medium and transferred through a more convoluted sequence of steps.
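To give a flavour of what this involves on the Arduino side, here is a minimal sketch of the receiving end of such a transfer. Everything in it – the baud rate, the one-character acknowledgement and the programByte placeholder – is an assumption for illustration rather than a description of the software actually written; the point is simply that the Arduino only needs to read bytes from the serial connection and program them at successive addresses.

// Illustrative sketch only: receive a ROM image over USB serial, one byte at
// a time, and program each byte at the next address in the memory chip.

const unsigned long IMAGE_SIZE = 131072UL;   // 128 kilobytes, the Am29F010's capacity

unsigned long nextAddress = 0;

// Placeholder: a real version would perform the memory chip's write cycle,
// which is the subject of the rest of this article.
void programByte(unsigned long address, byte value) {
  (void) address;
  (void) value;
}

void setup() {
  Serial.begin(115200);
}

void loop() {
  if (nextAddress < IMAGE_SIZE && Serial.available() > 0) {
    programByte(nextAddress, Serial.read());
    nextAddress++;
    Serial.write('.');   // acknowledge each byte so the host can pace the transfer
  }
}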

Breadboarding

Being somewhat familiar with Arduino experimentation, the first exercise was to make the circuit that can be used to program the memory device. Here, the first challenge presented itself: the chip employs 17 address lines, 8 data lines, and 3 control lines. Meanwhile, the Arduino Duemilanove only provides 14 digital pins and 6 analogue pins, with 2 of the digital pins (0 and 1) being unusable if the Arduino is communicating with a host, and another (13) being connected to the on-board LED and therefore seemingly untrustworthy. Even with the analogue pins in service as digital output pins, only 17 pins would be available for interfacing.

The pin requirements

Arduino Duemilanove    | Am29F010
11 digital pins (2-12) | 17 address pins (A0-A16)
6 analogue pins (0-5)  | 8 data pins (DQ0-DQ7)
                       | 3 control pins (CE#, OE#, WE#)
17 total               | 28 total

So, a way of multiplexing the Arduino pins was required, where at one point in time the Arduino would be issuing signals for one purpose, these signals would then be “stored” somewhere, and then at another point in time the Arduino would be issuing signals for another purpose. Ultimately, these signals would be combined and presented to the memory device in a hopefully coherent fashion. We cannot really do this kind of multiplexing with the control signals because they typically need to be coordinated to act in a timing-sensitive fashion, so we would be concentrating on the other signals instead.

So which signals would be stored and issued later? Well, with as many address lines needing signals as there are available pins on the Arduino, it would make sense to “break up” this block of signals into two. So, when issuing an address to the memory device, we would ideally be issuing 17 bits of information all at the same time, but instead we take approximately half of them (8 bits) and issue the necessary signals for storage somewhere. Then, we would issue the other half or so (8 bits) for storage. At this point, we need only a maximum of 8 signal lines to communicate information through this mechanism. (Don’t worry, I haven’t forgotten the other address bit! More on that in a moment!)

How would we store these signals? Fortunately, I had considered such matters before and had ordered some 74-series logic chips for general interfacing, including 74HC273 flip-flop ICs. These can be given 8 bits of information and will then, upon command, hold that information while other signals may be present on their input pins. If we take two of these chips and attach their input pins to those 8 Arduino pins we wish to use for communication, we can “program” each 74HC273 in turn – one with 8 bits of an address, the other with another 8 bits – and then the output pins will be presenting 16 bits of the address to the memory chip. At this point, those 8 Arduino pins could even be doing something else because the 74HC273 chips will be holding the signal values from an earlier point in time and won’t be affected by signals presented to their input pins.

Of all the non-control signals, with 16 of them out of the way, that leaves only the 8 signals for the memory chip’s data lines and that one other address signal to deal with. But since the Arduino pins used to send address signals are free once the addresses are sent, we can re-use those 8 pins for the data signals. So, with our signal storage mechanism, we get away with only using 8 Arduino pins to send 24 pieces of information! We can live with allocating that remaining address signal to a spare Arduino pin.

Address and data pins

Arduino Duemilanove | 74HC273       | Am29F010
8 input/output pins | 8 output pins | 8 address pins (A0-A7)
                    | 8 output pins | 8 address pins (A8-A15)
                    |               | 8 data pins (DQ0-DQ7)
1 output pin        |               | 1 address pin (A16)
9 total             |               | 25 total
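
To make the latching mechanism concrete, here is a rough sketch of how a 17-bit address might be issued. The helper names are my own, and the pin numbers are assumptions for illustration: pins 2 and 3 for the two latch clocks happen to match the example connection table later in this article, but the ordering of the shared bus pins and the use of pin 12 for A16 are simply made up.

// Hypothetical pin assignments, for illustration only.
const int BUS_PINS[8] = {4, 5, 6, 7, 8, 9, 10, 11};   // shared bus, later reused for DQ0-DQ7
const int LATCH_LOW_CP  = 2;    // clock (CP) of the 74HC273 holding A0-A7
const int LATCH_HIGH_CP = 3;    // clock (CP) of the 74HC273 holding A8-A15
const int A16_PIN       = 12;   // the remaining address line, driven directly

// Present one byte on the shared bus pins (all assumed configured as OUTPUT in setup()).
void putBusByte(byte value) {
  for (int i = 0; i < 8; i++) {
    digitalWrite(BUS_PINS[i], (value >> i) & 1);
  }
}

// Store one byte in a 74HC273 by presenting it on the bus and pulsing that chip's
// clock: the flip-flops capture their inputs on the rising edge and hold them.
void latchByte(int clockPin, byte value) {
  putBusByte(value);
  digitalWrite(clockPin, HIGH);
  digitalWrite(clockPin, LOW);
}

// Issue a 17-bit address: two bytes held by the latches plus the directly-driven A16.
void setAddress(unsigned long address) {
  latchByte(LATCH_LOW_CP, address & 0xFF);
  latchByte(LATCH_HIGH_CP, (address >> 8) & 0xFF);
  digitalWrite(A16_PIN, (address >> 16) & 1);
}

The important property is that, once each 74HC273 has been clocked, the two address halves stay put no matter what the shared pins do next.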

That now leaves us with the task of managing the 3 control signals for the memory chip – to make it “listen” to the things we are sending to it – but at the same time, we also need to consider the control lines for those flip-flop ICs. Since it turns out that we need 1 control signal for each of the 74HC273 chips, we therefore need to allocate 5 additional interfacing pins on the Arduino – 3 for the memory chip plus 2 for the flip-flops – for sending control signals to the different chips.

The final sums

Arduino Duemilanove | 74HC273                          | Am29F010
8 input/output pins | 8 output pins                    | 8 address pins (A0-A7)
                    | 8 output pins                    | 8 address pins (A8-A15)
                    |                                  | 8 data pins (DQ0-DQ7)
1 output pin        |                                  | 1 address pin (A16)
3 output pins       |                                  | 3 control pins (CE#, OE#, WE#)
2 output pins       | 2 control pins (CP for both ICs) |
14 total            |                                  | 28 total

In the end, we don’t even need all the available pins on the Arduino, but the three going spare wouldn’t be enough to save us from having to use the flip-flop ICs.
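
Building on the sketch above, a complete write to the memory device can then be expressed as follows. Again, the control pin choices (analogue pins 5, 4 and 3 for CE#, OE# and WE#) follow the example connection table shown later, while the precise sequencing is only a plausible reading of how such chips are commonly driven, not a transcript of the software actually used.

// Hypothetical control pin assignments, continuing the sketch above.
const int CE_PIN = A5;   // CE# (chip enable, active low)
const int OE_PIN = A4;   // OE# (output enable, active low)
const int WE_PIN = A3;   // WE# (write enable, active low)

// Perform one write cycle to the memory device at the given address.
void writeByte(unsigned long address, byte value) {
  digitalWrite(OE_PIN, HIGH);   // make sure the chip is not driving the shared data lines
  setAddress(address);          // latch A0-A15 into the 74HC273s, drive A16 directly
  putBusByte(value);            // the same 8 Arduino pins now carry the data for DQ0-DQ7
  digitalWrite(CE_PIN, LOW);    // select the chip
  digitalWrite(WE_PIN, LOW);    // begin the write cycle...
  digitalWrite(WE_PIN, HIGH);   // ...the chip latches the data on this rising edge
  digitalWrite(CE_PIN, HIGH);   // and deselect again
}

Reading a byte back would follow the same pattern, with the shared pins switched to inputs and OE# taken low instead of WE#.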

With this many pins in use, and the need to connect them together, there are going to be a lot of wires in use:

The breadboard circuit with the Arduino and ICs

The result is somewhat overwhelming! Presented in a more transparent fashion, and with some jumper wires replaced with breadboard wires, it is slightly easier to follow:

An overview of the breadboard circuit

The orange wires between the two chips on the right-hand breadboard indicate how the 8 Arduino pins are connected beyond the two flip-flop chips and directly to the flash memory chip, which would sit on the left-hand breadboard between the headers inserted into that breadboard (which weren’t used in the previous arrangement).

Making a Circuit Board

It should be pretty clear that while breadboarding can help a lot with prototyping, things can get messy very quickly with even moderately complicated circuits. And while I was prototyping this, I was running out of jumper wires that I needed for other things! Although this circuit is useful, I don’t want to have to commit my collection of components to keeping it available “just in case”, but at the same time I don’t want to have to wire it up when I do need it. The solution to this dilemma was obvious: I should make a “proper” printed circuit board (PCB) and free up all my jumper wires!

It is easy to be quickly overwhelmed when thinking about making circuit boards. Various people recommend various different tools for designing them, ranging from proprietary software that might be free-of-charge in certain forms but which imposes arbitrary limitations on designs (as well as curtailing your software freedoms) through to Free Software that people struggle to recommend because they have experienced stability or functionality deficiencies with it. And beyond the activity of designing boards, the act of getting them made is confused by the range of services in various different places with differing levels of service and quality, not to mention those people who advocate making boards at home using chemicals that are, shall we say, not always kind to the skin.

Fortunately, I had heard of an initiative called Fritzing some time ago, initially in connection with various interesting products being sold in an online store, but whose store then appeared to be offering a service – Fritzing Fab – to fabricate individual circuit boards. What isn’t clear, or wasn’t really clear to me straight away, was that Fritzing is also some Free Software that can be used to design circuit boards. Conveniently, it is also available as a Debian package.

The Fritzing software aims to make certain tasks easy that would perhaps otherwise require a degree of familiarity with the practice of making circuit boards. For instance, having decided that I wanted to interface my circuit to an Arduino as a shield which sits on top and connects directly to the connectors on the Arduino board, I can choose an Arduino shield PCB template in the Fritzing software and be sure that if I then choose to get the board made, the dimensions and placement of the various connections will all be correct. So for my purposes and with my level of experience, Fritzing seems like a reasonable choice for a first board design.

Replicating the Circuit

Fritzing probably gets a certain degree of disdain from experienced practitioners of electronic design because it seems to emphasise the breadboard paradigm, rather than insisting that a proper circuit diagram (or schematic) acts as the starting point. Here is what my circuit looks like in Fritzing:

The breadboard view of my circuit in Fritzing

You will undoubtedly observe that it isn’t much tidier than my real-life breadboard layout! Having dragged a component like the Arduino Uno (mostly compatible with the Duemilanove) onto the canvas along with various breadboards, and then having dragged various other components onto those breadboards, all that remains is that we wire them up like we managed to do in reality. Here, Fritzing helps out by highlighting connections between things, so that breadboard columns appear green as wires are connected to them, indicating that an electrical connection is made and applies to all points in that column on that half of the breadboard (the upper or lower half as seen in the above image). It even highlights things that are connected together according to the properties of the device, so that any attempt to modify a connection that leads to one of the ground pins on the Arduino also highlights the other ground pins as the modification is being done.

I can certainly understand criticism of this visual paradigm. Before wiring up the real-life circuit, I typically write down which things will be connected to each other in a simple table like this:

Example connections

Arduino | 74HC273 #1 | 74HC273 #2 | Am29F010
A5      |            |            | CE#
A4      |            |            | OE#
A3      |            |            | WE#
2       | CP         |            |
3       |            | CP         |
4       | D3         | D3         | DQ3

If I were not concerned with prototyping with breadboards, I would aim to use such information directly and not try and figure out which size breadboard I might need (or how many!) and how to arrange the wires so that signals get where they need to be. When one runs out of points in a breadboard column and has to introduce “staging” breadboards (as shown above by the breadboard hosting only incoming and outgoing wires), it distracts from the essential simplicity of a circuit.

Anyway, once the circuit is defined – and here it really does help that, upon clicking on a terminal or pin, the connected terminals or pins are highlighted – we can move on to the schematic view and try and produce something that makes a degree of sense. Here is what that should look like in Fritzing:

The schematic for the circuit in Fritzing

Now, the observant amongst you will notice that this doesn’t look very tidy at all. First of all, there are wires going directly between terminals without any respect for tidiness whatsoever. The more observant will notice that some of the wires end in the middle of nowhere, although on closer inspection they appear to be aimed at a pin of an IC but are shifted to the right on the diagram. I don’t know what causes this phenomenon, but it would seem that as far as the software is concerned, they are connected to the component. (I will come back to how components are defined and the pitfalls involved later on.)

Anyway, one might be tempted to skip over this view and try and start designing a PCB layout directly, but I found that it helped to try and tidy this up a bit. First of all, the effects of the breadboard paradigm tend to manifest themselves with connections that do not really reflect the logical relationships between components, so that an Arduino pin that feeds an input pin on both flip-flop ICs as well as a data pin on the flash memory IC may have its connectors represented by a wire first going from the Arduino to one of the flip-flop ICs, then to the other flip-flop IC, and finally to the flash memory IC in some kind of sequential wiring. Although electrically this is not incorrect, with a thought to the later track routing on a PCB, it may not be the best representation to help us think about such subsequent problems.

So, for my own sanity, I rearranged the connections to “fan out” from the Arduino as much as possible. This was at times a frustrating exercise, as those of you with experience with drawing applications might recognise: trying to persuade the software that you really did select a particular thing and not something else, and so on. Again, selecting the end of a connection causes some highlighting to occur, and the desired result is that selecting a terminal highlights the appropriate terminals on the various components and not the unrelated ones.

Sometimes that highlighting behaviour provides surprising and counter-intuitive results. Checking the breadboard layout tends to be useful because Fritzing occasionally thinks that a new connection between certain pins has been established, and it helpfully creates a “rats nest” connection on the breadboard layout without apparently saying anything. Such “rats nest” connections are logical connections that have not been “made real” by the use of a wire, and they feature heavily in the PCB view.

PCB Layout

For those of us with no experience of PCB layout who just admire the PCBs in everybody else’s products, the task of laying out the tracks so that they make electrical sense is a daunting one. Fritzing will provide a canvas containing a board and the chosen components, but it is up to you to combine them in a sensible way. Here, the circuit board actually corresponds to the Arduino in the breadboard and schematic views.

But slightly confusing though the depiction of the Arduino is in the breadboard view, the pertinent aspects of it are merely the connectors on that device, not the functionality of the device itself, which we obviously aren’t intending to replicate. So, instead of the details of an actual Arduino or its functional equivalent, we merely see the connection points required by the Arduino. And by choosing a board template for an Arduino shield, those connection points should appear in the appropriate places, as well as the board itself having the appropriate size and shape to be an Arduino shield.

Here’s how the completed board looks:

The upper surface of the PCB design in Fritzing

Of course, I have spared you a lot of work by just showing the image above. In practice, the components whose outlines and connectors feature above need to be positioned in sensible places. Then, tracks need to be defined connecting the different connection points, with dotted “rats nest” lines directly joining logically-connected points needing to be replaced with physical wiring in the form of those tracks. And of course, tracks do not enjoy the same luxury as the wires in the other views, of being able to cross over each other indiscriminately: they must be explicitly routed to the other side of the board, either using the existing connectors or by employing vias.

The lower surface of the PCB design in Fritzing

Hopefully, you will get to the point where there are no more dotted lines and where, upon selecting a connection point, all the appropriate points light up, just as we saw when probing the details of the other layouts. To reassure myself that I probably had connected everything up correctly, I went through my table and inspected the pin-outs of the components and did a kind of virtual electrical test, just to make sure that I wasn’t completely fooling myself.

With all this done, there isn’t much more to do before building up enough courage to actually get a board made, but one important step that remains is to run the “design checks” via the menu to see if there is anything that would prevent the board from working correctly or from otherwise being made. It can be the case that tracks do cross – the maze of yellow and orange can be distracting – or that they are too close and might cause signals to go astray. Fortunately, the hours of planning paid off here and only minor adjustments needed to be done.

It should be noted that the exercise of routing the tracks is certainly not to be underestimated when there are as many connections as there are above. Although an auto-routing function is provided, it failed to suggest tracks for most of the required connections and produced some bizarre routing as well. But clinging onto the memory of a working circuit in real three-dimensional space, along with the hope that two sides of a circuit board are enough and that there is enough space on the board, can keep the dream of a working design alive!

The Components

I skipped over the matter of components earlier on, and I don’t really want to dwell on the matter too much now, either. But one challenge that surprised me given the selection of fancy components that can be dragged onto the canvas was the lack of a simple template for a 32-pin DIP (dual in-line package) socket for the Am29F010 chip. There were socket definitions of different sizes, but it wasn’t possible to adjust the number of pins.

Now, there is a parts editor in Fritzing, but I tend to run away from graphical interfaces where I suspect that the matter could be resolved in more efficient ways, and it seems like other people feel the same way. Alongside the logical definition of the component’s connectors, one also has to consider the physical characteristics such as where the connectors are and what special markings will be reproduced on the PCB’s silk-screen for the component.

After copying an existing component, ransacking the Fritzing settings files, and editing various files – including those telling Fritzing about my new parts – I achieved my modest goals. But I would regard this as perhaps the weakest part of the software. I didn’t resort to doing things the behind-the-scenes way immediately, but the copy-and-edit paradigm was incredibly frustrating and doesn’t seem to be readily documented in a way I could usefully follow. There is a Sparkfun tutorial which describes things at length, but one cannot help feeling that a lot of this should be easier, especially for very simple component changes like the one I needed.

The Result

With some confidence and only modest expectations of success, I elected to place an order with the Fritzing Fab service and to see what the result would end up like. This was straightforward for the most part: upload the file created by Fritzing, fill out some details (albeit not via a secure connection), and then proceed to payment. Unfortunately, the easy payment method involves PayPal, and unfortunately PayPal wants random people like myself to create an account with them before they will consider letting me make a credit card payment, which is something that was not previously required. Fortunately, the Fritzing people are most accommodating and do support wire transfers as an alternative payment method, and they were very responsive to my queries, so I managed to get an order submitted even more quickly than I thought might happen (considering that fabrication happens only once a week).

Just over a week after placing my order, the board was shipped from Germany, arriving a couple of days later here in Norway. Here is what it looked like:

The finished PCB from Fritzing

Now, all I had to do was to populate the board and to test the circuit again with the Arduino. First, I tested the connections using the Arduino’s 5V and GND pins with an LED in series with a resistor in an “old school” approach to the problem, and everything seemed to be as I had defined it in the Fritzing software.

Given that I don’t really like soldering things, the act of populating the board went about as well as expected, even though I could still clean up the residue from the solder a bit (which would lead me onto a story about buying the recommended chemicals that I won’t bother you with). Here is the result of that activity:

The populated board together with the Arduino

And, all that remained was the task of getting my software running and testing the circuit in its new form. Originally, I was only using 16 address pins, holding the seventeenth low, and had to change the software to handle these extended addresses. In addition, the issuing of commands to the flash memory device probably needed a bit of refinement as well. Consequently, this testing went on for a bit longer than I would have wished, but eventually I managed to successfully replicate the programming of a ROM image that had been done some time ago with the breadboard circuit.
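
For anyone wondering what “issuing of commands” means here: unlike a simple ROM or EEPROM write, each byte programmed into the Am29F010 is preceded by a short command sequence, typically documented as writes of 0xAA to address 0x555, 0x55 to 0x2AA and 0xA0 to 0x555, followed by a write of the data to the target address. A rough sketch, reusing the writeByte() routine outlined earlier, rather than the exact code used:

// Program one byte into the Am29F010 (the relevant sector must already have been
// erased, since programming can only change bits from 1 to 0).
void programFlashByte(unsigned long address, byte value) {
  writeByte(0x555, 0xAA);      // unlock cycle 1
  writeByte(0x2AA, 0x55);      // unlock cycle 2
  writeByte(0x555, 0xA0);      // byte-program command
  writeByte(address, value);   // the actual address and data

  // The chip then programs the byte internally. Proper code would poll the DQ7
  // ("data polling") or DQ6 ("toggle bit") status mechanisms described in the
  // datasheet; a generous delay is enough for a sketch.
  delay(1);
}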

The outcome did rely on a certain degree of good fortune: the template for the Arduino Uno is not quite compatible with the Duemilanove, but this was rectified by clipping two superfluous pins from one of the headers I soldered onto the board; two of the connections belonging to the socket holding the flash memory chip touch the outside of the plastic “power jack” socket, but not enough to cause a real problem. But I would like to think that a lot of preparation dealt with problems that otherwise might have occurred.

Apart from liberating my breadboards and wires, this exercise has provided useful experience with PCB design. And of course, you can find the sources for all of this in my repository, as well as a project page for the board on the Fritzing projects site. I hope that this account of my experiences will encourage others to consider trying it out, too. It isn’t as scary as it would first appear, after all, although I won’t deny that it was quite a bit of work!

New Fairphone, New Features, Same Old Software Story?

Saturday, August 15th, 2015

I must admit that I haven’t been following Fairphone of late, so it was a surprise to see that vague details of the second Fairphone device have been published on the Fairphone Web site. One aspect that seems to be a substantial improvement is that of hardware modularity. Since the popularisation of the notion that such a device could be built by combining functional units as if they were simple building blocks, with a lot of concepts, renderings and position statements coming from a couple of advocacy initiatives, not much else has actually happened in terms of getting devices out for people to use and develop further. And there are people with experience of designing such end-user products who are sceptical about the robustness and economics of such open-ended modular solutions. To see illustrations of a solution that will presumably be manufactured takes the idea some way along the road to validation.

If it is possible to, say, switch out the general-purpose computing unit of the Fairphone with another one, then it can be said that even if the Fairphone initiative fails once again to deliver a software solution that is entirely Free Software, perhaps because the choice of hardware obliges the initiative to deliver opaque “binary-only” payloads, the opportunity might be there for others to deliver a bottom-to-top free-and-open solution as a replacement component. But one might hope that it should not be necessary to “opt in” to getting a system whose sources can be obtained, rebuilt and redeployed: that the second Fairphone device might have such desirable characteristics out of the box.

Now, it does seem that Fairphone acknowledges the existence and the merits of Free Software, at least in very broad terms. Reading the support site provides us with an insight into the current situation with regard to software freedom and Fairphone:

Our goal is to take a more open source approach to be able to offer owners more choice and control over their phone’s OS. For example, we want to make the source code available to the developer community and we are also in discussions with other OS vendors to look at the possibility of offering alternative operating systems for the Fairphone 2. However, at the moment there are parts of the software that are owned or licensed by third parties, so we are still investigating the technical and legal requirements to accomplish our goals of open software.

First of all, ignoring vague terms like “open software” that are susceptible to “openwashing” (putting the label “open” on something that really isn’t), it should be noted that various parts of the deployed software will, through their licensing, oblige the Fairphone initiative to make the corresponding source code available. This is not a matter that can be waved away with excuses about people’s hands being tied, about coordination being difficult, or whatever other excuse the average GPL-violating vendor might offer. If copyleft-licensed code ships, the sources must follow.

Now there may also be proprietary software on the device (or permissively-licensed software bearing no obligation for anyone to release the corresponding source, which virtually amounts to the same thing), and that would clearly be against software freedom and something Fairphone should strongly consider avoiding, because neither end-users nor anyone who may wish to help those users would have any control over such software: they would be completely dependent on the vendor, who in turn would be completely dependent on their suppliers, who in turn might suddenly not care about the viability of that software or the devices on which it is deployed. So much for sustainability under such circumstances!

As I noted before, having control over the software is not a perk for those who wish to “geek out” over the internals of a product: it is a prerequisite for product viability, longevity and sustainability. Let us hope that Fairphone can not only learn and apply the lessons from their first device, which may indeed have occurred with the choice of a potentially supportable chipset this time around, but that the initiative can also understand and embrace their obligations to those who produced the bulk of their software (as well as to their customers) in a coherent and concrete fashion. It would be a shame if, once again, an unwillingness to focus on software led to another missed opportunity, and the need for another version of the device to be brought to market to remedy deficiencies in what is otherwise a well-considered enterprise.

Now, if only Fairphone could organise their Web site in a more coherent fashion, putting useful summaries of essential information in obvious places instead of burying them in some random forum post.

EOMA-68: The Return

Wednesday, April 8th, 2015

It is hard to believe that almost two years have passed since I criticised the Ubuntu Edge crowd-funding campaign for being a distraction from true open hardware initiatives (becoming one which also failed to reach its funding target, but was presumably good advertising for Ubuntu’s mobile efforts for a short while). Since then, the custodians of Ubuntu have pressed on with their publicity stunts, the most recent of which involved limited initial availability of an Ubuntu-branded smartphone that may very well have been shipping without the corresponding source code for the GPL-licensed software being available, even though it is now claimed that this issue has been remedied. Given the problems with the same chipset vendor in other products, I personally cannot help feeling that the matter might need more investigation, but then again, I personally do not have time to chase up licence compliance in other people’s products, either.

Meanwhile, some genuine open hardware initiatives were mentioned in that critique of Ubuntu’s mobile strategy: GTA04 is the continuing effort to produce a smartphone that continues the legacy of the Openmoko Neo FreeRunner, whose experiences are now helping to produce the Neo900 evolution of the Nokia N900 smartphone; Novena is an open hardware laptop that was eventually successfully crowd-funded and is in the process of shipping to backers; OpenPandora is a handheld games console, the experiences from which have since been harnessed to initiate the DragonBox Pyra product with a very similar physical profile and target audience. There is a degree of collaboration and continuity within some of these projects, too: the instigator of the GTA04 project is assisting with the Neo900 and the Pyra, for example, partly because these projects use largely the same hardware platform. And, of course, GNU/Linux is the foundation of the software for all this hardware.

But in general, open hardware projects remain fairly isolated entities, perhaps only clustering into groups around particular chipsets or hardware platforms. And when it comes to developing a physical device, the amount of re-use and sharing between projects is perhaps less than we might have come to expect from software, particularly Free Software. Not that this has necessarily slowed the deluge of boards, devices, products and crowd-funding campaigns: everywhere you look, there’s a new Arduino variant or something claiming to be the next big thing in the realm of the “Internet of Things” (IoT), but after a while one gets the impression that it is the same thing being funded and sold, over and over again, with the audience probably not realising that it has all mostly been done before.

The Case for Modularity

Against this backdrop, there is one interesting and somewhat unusual initiative that I have only briefly mentioned before: the development of the EOMA-68 (Embedded Open Modular Architecture 68) standard along with products to demonstrate it. Unlike the average single-board computer or system-on-module board, EOMA-68 attempts to define a widely-used modular computing unit which is also a complete computing device, delegating input (keyboards, mice, storage) and output (displays) to other devices. It has often been repeated that today phones are just general-purpose computers that happen to be able to make calls, and the same can be said for a lot of consumer electronics equipment that traditionally were either much simpler devices or which only employed special-purpose computing units to perform their work: televisions are a reasonably illustrative example of this.

And of course, computers as we know them come in all shapes and sizes now: phones, media players, handhelds, tablets, netbooks, laptops, desktops, workstations, and so on. But most of these devices are not built to be upgraded when the core computing part of them becomes obsolete or, at the very least, less attractive than the computing features of newer devices, nor can the purchaser mix and match the computing part of one device with the potentially more attractive parts of another: one kind of smart television may have a much better screen but a poorer user interface that one would want to replace, for example. There are workarounds – some people use USB-based “plug computers” to give their televisions “smart” capabilities – but when you buy a device, you typically have to settle for the bundled software and computing hardware (even if the software might eventually be modifiable thanks to the role of the GPL, subject to constraints imposed by manufacturers that might prevent modification).

With a modular computing unit, the element of choice is obviously enhanced, but it also helps those developing open hardware. First of all, the interface to the computing unit is well-defined, meaning that the designers of a device need not be overly concerned with the way the general-purpose computing functionality is to be provided beyond the physical demands of that particular module and the facilities provided by it. Beyond such constraints, being able to rely on a tested functional element, designers can focus on the elements of their device that differentiate it from other devices without having to master the integration of their own components of interest with those required for the computing functionality in one “make or break” hardware design that might prove too demanding to get right first time (or even second or third time). Prototyping complicated circuit designs can quickly incur considerable costs, and eliminating complexity from what might be described as the “peripheral board” – the part providing the input and output capabilities and the character of a particular device – not only reduces the risk of getting things wrong, but it could make the production of that board cheaper, too. And that might open up device design to a broader group of participants.

As Nico Rikken explains, EOMA-68 promises to offer benefits for hardware designers, software developers and customers. Modularity does make sense if properly considered, which is perhaps why other modularity initiatives like Phonebloks have plenty of critics even though they share the same worthy objectives of reducing waste and avoiding device obsolescence: with vague statements about modularity and the hint of everything being interchangeable and interoperating with everything, one cannot help but be sceptical about the potential complexity and interoperability problems that could result, not to mention the ergonomic issues that most people can easily relate to. By focusing on the general-purpose computing aspect of modularity, EOMA-68 addresses the most important part of the hardware for Free Software and delegates modularity elsewhere in the system to other initiatives that do not claim to do it all.

A Few False Starts

Unfortunately, not everything has gone precisely according to schedule with EOMA-68 so far. After originally surfacing as part of an initiative to make a competitive ARM-based netbook, the plan was to make computing modules and “engineering boards” on the way to delivering a complete product, and the progress of the first module can be followed on the Allwinner A10 news page on the Rhombus Tech Web site. From initial interest from various parties at the start of 2012, and through a considerable amount of activity, by May 2013, working A10 boards were demonstrated running Debian Wheezy. And a follow-up board employing the Allwinner A20 instead of the A10 was demonstrated running Debian at the end of October 2014 as part of a micro-desktop solution.

One might have thought that these devices would be more widely available by now, particularly as development began in 2012 on a tablet board to complement the computing modules, with apparently steady progress being made. Now, the development of this tablet was driven by the opportunity to collaborate with the Vivaldi tablet project, whose own product had been rendered unusable for Free Software usage by the usual product iteration performed behind the scenes by the contract manufacturer changing the components in use without notice (as is often experienced by those buying computers to run Free Software operating systems, only to discover that the wireless chipset, say, is no longer one that is supported by Free Software). With this increased collaboration with KDE-driven hardware initiatives (Improv and Vivaldi), efforts seemingly became directed towards satisfying potential customers within the framework of those other initiatives, so that to acquire the micro-engineering board one would seek to purchase an Improv board instead, and to obtain a complete tablet product one would place an advance order for the Vivaldi tablet instead of anything previously under development.

Somehow during 2014, the collaboration between the participants in this broader initiative appears to have broken down, with there undoubtedly being different perspectives on the sequence of events that led to the cancellation of Improv and Vivaldi. Trawling the mailing list archives gives more detail but not much more clarity, and it can perhaps only be said that mistakes may have been made and that everybody learned new things about certain aspects of doing business with other people. The effect, especially in light of the deluge of new and shiny products for casual observers to purchase instead of engaging in this community, and with many people presumably being told that their Vivaldi tablet would not be shipping after all, probably meant that many people lost interest and, indeed, hope that there would be anything worth holding out for.

The Show Goes On

One might have thought that such a setback would have brought about the end of the initiative, but its instigator shows no sign of quitting, probably because genuine hardware has been made, and other opportunities and collaborations have been created on the way. Initially, the focus was on an ARM-based netbook or tablet that would run Free Software without the vendor neglecting to provide the complete corresponding source for things like the Linux kernel and bootloader required to operate the device. This requirement for licence compliance has not disappeared or diminished, with continuing scrutiny placed on vendors to make sure that they are not just throwing binaries over the wall.

But as experience was gained in evaluating suitable CPUs, it was not only ARM CPUs that were found to have the necessary support characteristics for software freedom as well as for low power consumption. The Ingenic jz4775, a sibling of the rather less capable jz4720 used by the Ben NanoNote, uses the MIPS architecture and may well be fully supported by the mainline Linux kernel in the near future; the ICubeCorp IC1T is a more exotic CPU that can be supported by Free Software toolchains and should be able to run the Linux kernel in addition to Android. Alongside these, the A20 remains the most suitable of the options from Allwinner, whose products have always been competitively priced (which has also been a consideration), but there are other ARM derivatives that would be more interesting from a vendor cooperation perspective, notably the TI AM389x series of CPUs.

Meanwhile, after years of questions about whether a crowd-funding campaign would be started to attract customers and to get the different pieces of hardware built in quantity, plans for such a campaign are now underway. While initial calls for a campaign may have been premature, I now think that the time is right: people have been using the hardware already produced for some time, and considerable experience has been amassed along the way up to this point; the risks should be substantially lower than quite a few other crowd-funding campaigns that seem to be approved and funded these days. Not that anyone should seek to conceal the nature of crowd-funding and the in-built element of risk associated with such campaigns, of course: it is not the same as buying a product from a store.

Nevertheless, I would be very interested to see this hardware being made, and I am even on record as having said so. Part of this is selfishness: I could do with some newer, quieter, less power-consuming hardware. But I also think that a choice of different computing modules, supporting Free Software operating systems out of the box, with some of them candidates for FSF endorsement, and offering a diversity of architectures, would be beneficial to a sustainable computing infrastructure in the longer term. If you also think so, maybe you should follow the progress of EOMA-68 over the coming weeks and months, too.

Open Hardware and Free Software: Not Just For The Geeks

Saturday, April 4th, 2015

Having seen my previous article about the Fairphone initiative’s unfortunate choice of technologies mentioned in various discussions about the Fairphone, I feel a certain responsibility to follow up on some of the topics and views that tend to get aired in these discussions. In response to an article about an “open operating system” for the Fairphone, a rather solid comment was made about how the initiative still seems to be approaching the problem from the wrong angle.

Because the article comments have been delegated to a proprietary service that may at some point “garbage-collect” them from the public record, I reproduce the comment here (and I also expanded the link previously provided by a link-shortening service for similar and other reasons):

You are having it all upside down.
Just make your platform open instead of using proprietary chipsets with binary blobs! Then porting Firefox OS to the Fairphone would be easy as pie.

Not listening to the people who said that only free software running on open hardware would be really fair is exactly what brought you this mess: Our approach to software and ongoing support for the first Fairphones
It is also why I advised all of my friends and acquaintances not to order a Fairphone until it becomes a platform that respects user freedom. Turns out I was more than right.
If the Fairphone was an open platform that could run Firefox OS, Replicant or pure Debian, I would tell everybody in need of a cellphone to buy one.

I don’t know the person who wrote this comment, but it is very well-formulated, and one wouldn’t think that there would be much to add. Unfortunately, some people seem to carry around their own misconceptions about some of the concepts mentioned above, and unfortunately, they are quite happy to propagate those misconceptions as if they were indisputable facts. Below, I state the real facts in the headings and quote each one of the somewhat less truthful misconceptions for further scrutiny.

Open Hardware and Free Software is for Everyone

Fairphone should not make the mistake of producing a phone for geeks. Instead, it should become a phone for everyone.

Just because people have an opinion about technology and wish to see certain guarantees made about the nature of that technology does not mean that the result is “for geeks”. In fact, making the hardware open means that more people can figure things out about it, improve it, understand it, and improve the way it works and the software that uses it. Making the software truly open means that more people can change it, fix it, enhance it, and extend the usable life of the device. All of this benefits everyone, whereas closed hardware and proprietary software ultimately benefit only the small groups of people who respectively designed the device and wrote the software, both of whom are very likely to lose interest in sustaining the life of that product as soon as they have another one they want to sell you. (And often, in the case of the hardware, as soon as it leaves the factory.)

User Freedom Means Exactly User Freedom

‘User freedom’ is often used when actually ‘developers freedom’ is meant. It is more of an ideology.

Incorrect! Those of us who use the term Free Software know exactly what we mean: it is the freedom of the end-user to exercise precisely those privileges that have resulted in the work being produced and delivered to them. Now, there are people who advocate “permissive licences” that do favour developers in that they allow people to use the work of others and to then provide a piece of software under conditions that grants the end-user only limited privileges, taking away those privileges to see how the entire work is constructed, along with those that allow the entire work to be improved and shared. Whether one sees either of these as an ideology, presumably emphasising one’s own “pragmatism” in contrast, is largely irrelevant because the genuine pragmatism involved in Free Software and the propagation of a broader set of privileges actually delivers sustainability: users – genuine end-users, not middle-men – get the freedom to participate in how the product turns out, and crucially, how it lives on after the original producer has decided to go off and do something else.

Openness Does Not Preclude Fanciness (But Security Requires Openness)

What people want is: user friendly interface, security/privacy, good specs and ability to install apps and games. […] OpenSource is a nice idea, but has its disadvantages too: who is caring about quality?

It’s just too easy for people to believe claims about privacy and security, even after everybody found out that they were targets of widespread surveillance, even after various large corporations who presumably care about their reputations have either lost the personal details of their users to criminals or have shared those details with others (who also have criminal or unethical intent), and when believing the sales-pitch about total privacy and robust security, those people will happily reassure themselves and others that no company would allow its reputation to be damaged by any breach of privacy or security! But there are no guarantees of security or privacy if you cannot trust the systems you use, and there is no way of trusting them without being able to inspect how they work. More than ever, people need genuine guarantees of security and privacy – not reassurances from salesmen and advertisers – and the best way to start off on the path towards such guarantees is to be able to deploy Free Software on a device that you fully control.

And as for quality, user-friendliness and all the desirable stuff: how many people use products like Firefox in its various forms every single day? Such Free Software solutions have not merely set the standard over the years, but they have kept technologies like the Web relevant and viable, in stark contrast to proprietary bundled programs like Internet Explorer that have actually impaired technological and social progress, with “IE” doing its bit by exhibiting a poor record of adherence to standards and a continuous parade of functionality and security bugs, not to mention constant usability frustrations endured by its unfortunate (and frequently involuntary) audience of users.

Your Priorities Make Free Software Important

I found the following comment to be instructive:

For me open source isn’t important. My priorities are longevity/updates, support, safety/privacy.

The problem is this: how can you guarantee longevity, updates, support, safety and privacy without openness? Safety and privacy would require you to have blind trust in someone whose claims you cannot verify. Longevity, updates and support require you to rely on the original producer’s continued interest in the product that you have just purchased from them, and should it become more profitable for them to focus on other products (that they might want you to buy instead of continuing to use the one you have), you might be able to rely on the goodwill of that producer to transfer their responsibilities to others to do the thankless tasks of maintenance and support. But it may well be the case that no amount of money will be able to keep that product viable for you: the producer may simply refuse to support it or to let others support it. Perhaps some people may step in and reverse-engineer the product and make an effort to keep it viable, but wouldn’t it be better to have an open product to start with, where people can choose how it is maintained – and thus sustained – for as long as people still want to use it?

Concepts like open hardware and Free Software sound like topics for the particularly-interested, but they provide the foundations for those topics of increasing interest and attention that people claim to care so much about. Everybody deserves things like choice, democracy, privacy, security, safety, control over their own lives and destinies, and so on. Closed hardware and proprietary software may be used on lots of devices, and people may be getting a lot of use out of those devices, but the users of those devices enjoy the benefits only as long as it remains in the interests of the producers of those devices and the accompanying software to allow them to do so. Furthermore, few or none of those users can be sure whether any of those important things – their rights – are being impaired by their use of those devices. Are their communications being intercepted, collected, analysed? Few people would ever know.

Free Software and open hardware empower their users with the control that proprietary technologies deny their users. But shouldn’t everybody be able to benefit from such control? That’s why a device that is open hardware and which runs Free Software really is for everyone, not just for “geeks”.

The BBC Micro and the BBC Micro Bit

Sunday, March 22nd, 2015

At least on certain parts of the Internet as well as in other channels, there has been a degree of excitement about the announcement by the BBC of a computing device called the “Micro Bit”, with the BBC’s plan to give one of these devices to each child starting secondary school, presumably in September 2015, attracting particular attention amongst technology observers and television licence fee-payers alike. Details of the device are a little vague at the moment, but the announcement along with discussions of the role of the corporation and previous initiatives of this nature provides me with an opportunity to look back at the original BBC Microcomputer, evaluate some of the criticisms (and myths) around the associated Computer Literacy Project, and to consider the things that were done right and wrong, with the latter hopefully not about to be repeated in this latest endeavour.

As the public record reveals, at the start of the 1980s, the BBC wanted to engage its audience beyond television programmes describing the growing microcomputer revolution, and it was decided that to do this and to increase computer literacy generally, it would need to be able to demonstrate various concepts and technologies on a platform that would be able to support the range of activities to which computers were being put to use. Naturally, a demanding specification was constructed – clearly, the scope of microcomputing was increasing rapidly, and there was a lot to demonstrate – and various manufacturers were invited to deliver products that could be used as this reference platform. History indicates that a certain amount of acrimony followed – a complete description of which could fill an entire article of its own – but ultimately only Acorn Computers managed to deliver a machine that could do what the corporation was asking for.

An Ambitious Specification

It is worth considering what the BBC Micro was offering in 1981, especially when considering ill-informed criticism of the machine’s specifications by people who either prefer other systems or who felt that participating in the development of such a machine was none of the corporation’s business. The technologies to be showcased by the BBC’s programme-makers and supported by the materials and software developed for the machine included full-colour graphics, multi-channel sound, 80-column text, Viewdata/Teletext, cassette and diskette storage, local area networking, interfacing to printers, joysticks and other input/output devices, as well as to things like robots and user-developed devices. Although it is easy to pick out one or two of these requirements, move forwards a year or two, increase the budget two- or three-fold, or any combination of these things, and to nominate various other computers, there really were few existing systems that could deliver all of the above, at least at an affordable price at the time.

Some microcomputers of the early 1980s

Computer | RAM | Text | Graphics | Year | Price
Apple II Plus | Up to 64K | 40 x 25 (upper case only) | 280 x 192 (6 colours), 40 x 48 (16 colours) | 1979 | £1500 or more
Commodore PET 4032/8032 | 32K | 40/80 x 25 | Graphics characters (2 colours) | 1980 | £800 (4032), £1030 (8032) (including monochrome monitor)
Commodore VIC-20 | 5K | 22 x 23 | 176 x 184 (8 colours) | 1980 (1981 outside Japan) | £199
IBM PC (Model 5150) | 16K up to 256K | 40/80 x 25 | 640 x 200 (2 colours), 320 x 200 (4 colours) | 1981 | £1736 (including monochrome monitor, presumably with 16K or 64K)
BBC Micro (Model B) | 32K | 80/40/20 x 32/24, Teletext | 640 x 256 (2 colours), 320 x 256 (2/4 colours), 160 x 256 (4/8 colours) | 1981 | £399 (originally £335)
Research Machines LINK 480Z | 64K (expandable to 256K) | 40 x 24 (optional 80 x 24) | 160 x 72, 80 x 72 (2 colours); expandable to 640 x 192 (2 colours), 320 x 192 (4 colours), 190 x 96 (8 colours or 16 shades) | 1981 | £818
ZX Spectrum | 16K or 48K | 32 x 24 | 256 x 192 (16 colours applied using attributes) | 1982 | £125 (16K), £175 (48K)
Commodore 64 | 64K | 40 x 25 | 320 x 200 (16 colours applied using attributes) | 1982 | £399

Perhaps the closest competitor, already being used in a fairly limited fashion in educational establishments in the UK, was the Commodore PET. However, it is clear that despite the adaptability of that system, its display capabilities were becoming increasingly uncompetitive, and Commodore had chosen to focus on the chipsets that would power the VIC-20 and Commodore 64 instead. (The designer of the PET went on to make the very capable, and understandably more expensive, Victor 9000/Sirius 1.) The fact that Apple products were notoriously expensive and, indeed, the target of Commodore’s aggressive advertising did not seem to prevent them from capturing the US education market from the PET, but they always remained severely uncompetitive in the UK, as commentators of the time indeed noted.

Later, the ZX Spectrum and Commodore 64 were released. Technology was progressing rapidly, and in hindsight one might have advocated waiting around until more capable and cheaper products came to market. However, it can be argued that a suitable alternative fulfilling virtually all aspects of the ambitious specification at a comparable price would not have become available until the release of the Amstrad CPC series in 1984. Even then, these Amstrad computers actually benefited from the experience accumulated in the UK computing industry since the introduction of the BBC Micro: they were, if anything, an iteration within the same generation of microcomputers and would even have used the same 6502 CPU as the BBC Micro had it not been for time-to-market pressures and the readily-available expertise with the Zilog Z80 CPU amongst those in the development team. And yet, specific aspects of the specification would still have gone unfulfilled: the BBC Micro had hardware support for Teletext displays, whereas an alternative would have had to emulate them with a bitmapped display and suitable software.

Arise Sir Clive

Much has been made of the disappointment of Sir Clive Sinclair that his computers were not adopted by the BBC as products to be endorsed and targeted at schools. Sinclair made his name developing products that were competitive on price, often seeking cost-reduction measures to reach attractive pricing levels, but such measures also served to make his products less desirable. If one reads reviews of microcomputers from the early 1980s, many reviewers explicitly mention the quality of the keyboard provided by the computers being reviewed: a “typewriter” keyboard with keys that “travel” appears to be much preferred over the “calculator” keyboards provided by computers like the ZX Spectrum, Oric 1 or Newbury NewBrain, and vastly preferred over the “membrane” keyboards employed by the ZX80, ZX81 and Atari 400.

For target audiences in education, business, and the home, it would have been inconceivable to promote a product with anything less than a “proper” keyboard. Ultimately, the world had to wait until the ZX Spectrum +2, released in 1986, for a Spectrum with such a keyboard, and that occurred only after the product line had been acquired by Amstrad. (One might also consider the ZX Spectrum+ in 1984, but its keyboard was more of a hybrid of the calculator keyboard that had been used before and the “full-travel” keyboards provided by its competitors.)

Some people claim that they owe nothing to the BBC Micro and everything to the ZX Spectrum (or, indeed, the computer they happened to own) for their careers in computing. Certainly, the BBC Micro was an expensive purchase for many people, although contrary to popular assertion it was not any more expensive than the Commodore 64 upon that computer’s introduction in the UK, and for those of us who wanted BBC compatibility at home on a more reasonable budget, the Acorn Electron was really the only other choice. But it would be as childish as the playground tribalism that had everyone insist that their computer was “the best” to insist that the BBC Micro had no influence on computer literacy in general, or on the expectations of what computer systems should provide. Many people who owned a ZX Spectrum freely admit that the BBC Micro coloured their experiences, some even subsequently seeking to buy one or one of its successors and to go on to build a successful software development career.

The Costly IBM PC

Some commentators seem to consider the BBC Micro as having been an unnecessary diversion from the widespread adoption of the IBM PC throughout British society. As was the case everywhere else, the de-facto “industry standard” of the PC architecture and DOS captured much of the business market and gradually invaded the education sector from the top down, although significantly better products existed both before and after its introduction. It is tempting with hindsight to believe that by placing an early bet on the success of the then-new PC architecture, business and education could have somehow benefited from standardising on the PC and DOS applications. And there has always been the persistent misguided belief amongst some people that schools should be training their pupils/students for a narrow version of “the world of work”, as opposed to educating them to be able to deal with all aspects of their lives once their school days are over.

What many people forget or fail to realise is that the early 1980s witnessed rapid technological improvement in microcomputing, that there were many different systems and platforms, some already successful and established (such as CP/M), and others arriving to disrupt ideas of what computing should be like (the Xerox Alto and Star having paved the way for the Apple Lisa and Macintosh, the Atari ST, and so on). It was not clear that the IBM PC would be successful at all: IBM had largely avoided embracing personal computing, and although the system was favourably reviewed and seen as having the potential for success, thanks to IBM’s extensive sales organisation, other giants of the mainframe and minicomputing era such as DEC and HP were pursuing their own personal computing strategies. Moreover, existing personal computers were becoming entrenched in certain markets, and early adopters were building a familiarity with those existing machines that was reflected in publications and materials available at the time.

Despite the technical advantages of the IBM PC over much of the competition at the beginning of the 1980s, it was also substantially more expensive than the mass-market products arriving in significant numbers, aimed at homes, schools and small businesses. With many people remaining intrigued but unconvinced by the need for a personal computer, it would have been impossible for a school to justify spending almost £2000 (probably around £8000 today) on something without proven educational value. Software would also need to be purchased, and the procurement of expensive and potentially non-localised products would have created even more controversy.

Ultimately, the Computer Literacy Project stimulated the production of a wide range of locally-produced products at relatively inexpensive prices, and while there may have been a few years of children learning BBC BASIC instead of one of the variants of BASIC for the IBM PC (before BASIC became a deprecated aspect of DOS-based computing), it is hard to argue that those children missed out on any valuable experience using DOS commands or specific DOS-based products, especially since DOS became a largely forgotten environment itself as technological progress introduced new paradigms and products, making “hard-wired”, product-specific experience obsolete.

The Good and the Bad

Not everything about the BBC Micro and its introduction can be considered unconditionally good. Choices needed to be made to deliver a product that could fulfil the desired specification within certain technological constraints. Some people like to criticise BBC BASIC as being “non-standard”, for example, which neglects the diversity of BASIC dialects that existed at the dawn of the 1980s. Typically, for such people “standard” equates to “Microsoft”, but back then Microsoft BASIC was a number of different things. Commodore famously paid a one-off licence fee to use Microsoft BASIC in its products, but the version for the Commodore 64 was regarded as lacking user-friendly support for graphics primitives and other interesting hardware features. Meanwhile, the MSX range of microcomputers featured Microsoft Extended BASIC, which did provide convenient access to hardware features, although that range was not the success at the low end of the market that Microsoft had probably desired to complement its increasing influence at the higher end through the IBM PC. And it is informative in this regard to see just how many variants of Microsoft BASIC were produced, thanks to Microsoft’s widespread licensing of its software.

Nevertheless, the availability of one company’s products does not make a standard, particularly if interoperability between those products is limited. Neither BBC BASIC nor Microsoft BASIC can be regarded as anything other than a de-facto standard in its own territory, and it is nonsensical to regard one as non-standard when the other has largely the same characteristics, being a proprietary product in widespread use, even if it was licensed to others, as indeed both Microsoft BASIC and BBC BASIC were. Genuine attempts to standardise BASIC did indeed exist, notably BASICODE, which was used in the distribution of programs via public radio broadcasts. One suspects that people making casual remarks about standard and non-standard things remain unaware of such initiatives. Meanwhile, Acorn did deliver implementations of other standards-driven programming languages such as COMAL, Pascal, Logo, Lisp and Forth, largely adhering to any standards subject to the limitations of the hardware.

However, what undermined the BBC Micro and Acorn’s related initiatives over time was the control that they as a single vendor had over the platform and its technologies. At the time, a “winner takes all” mentality prevailed: Commodore under Jack Tramiel had declared a “price war” on other vendors and had caused difficulties for new and established manufacturers alike, with Atari eventually being sold to Tramiel (who had resigned from Commodore) by Warner Communications, but many companies disappeared or were absorbed by others before half of the decade had passed. Indeed, Acorn, who had released the Electron to compete with Sinclair Research at the lower end of the market, and who had been developing product lines to compete in the business sector, experienced financial difficulties and was ultimately taken over by Olivetti; Sinclair, meanwhile, experienced similar difficulties and was acquired by Amstrad. In such a climate, ideas of collaboration seemed far from everybody’s minds.

Since then, the protagonists of the era have been able to reflect on such matters, Acorn co-founder Hermann Hauser admitting that it may have been better to license Acorn’s Econet local area networking technology to interested competitors like Commodore. Although the sentiments might have something to do with revenues and influence – it was at Acorn that the ARM processor was developed, sowing the seeds of a successful licensing business today – the rest of us may well ask what might have happened had the market’s participants of the era cooperated on things like standards and interoperability, helping their customers to protect their investments in technology, and building a bigger “common” market for third-party products. What if they had competed on bringing technological improvements to market without demanding that people abandon their existing purchases (and cause confusion amongst their users) just because those people happened to already be using products from a different vendor? It is interesting to see the range of available BBC BASIC implementations and to consider a world where such building blocks could have been adopted by different manufacturers, providing a common, heterogeneous platform built on cooperation and standards, not the imposition of a single hardware or software platform.

But That Was Then

Back then, as Richard Stallman points out, proprietary software was the norm. It would have been even more interesting had the operating systems and the available software for microcomputers been Free Software, but that may have been asking too much at the time. And although computer designs were often shared and published, a tendency to prevent copying of commercial computer designs prevailed, with Acorn and Sinclair both employing proprietary integrated circuits mostly to reduce complexity and increase performance, but partly to obfuscate their hardware designs, too. Thus, it may have been too much to expect something like the BBC Micro to have been open hardware to any degree “back in the day”, although circuit diagrams were published in publicly-available technical documentation.

But we have different expectations now. We expect software to be freely available for inspection, modification and redistribution, knowing that this empowers the end-users and reassures them that the software does what they want it to do, and that they remain in control of their computing environment. Increasingly, we also expect hardware to exhibit the same characteristics, perhaps only accepting that some components are particularly difficult to manufacture and that there are physical and economic restrictions on how readily we may practise the modification and redistribution of a particular device. Crucially, we demand control over the software and hardware we use, and we reject attempts to prevent us from exercising that control.

The big lesson to be learned from the early 1980s, to be considered now in the mid-2010s, is not how to avoid upsetting a popular (but ultimately doomed) participant in the computing industry, as some commentators might have everybody believe. It is to avoid developing proprietary solutions that favour specific organisations and that, despite the general benefits of increased access to technology, ultimately disempower the end-user. And in this era of readily available Free Software and open hardware platforms, the lesson to be learned is to strengthen such existing platforms and to work with them, letting those products and solutions participate and interoperate with the newly-introduced initiative in question.

The BBC Micro was a product of its time and its development was very necessary to fill an educational need. Contrary to the laziest of reports, the Micro Bit plays a different role as an accessory rather than as a complete home computer, at least if we may interpret the apparent intentions of its creators. But as a product of this era, our expectations for the Micro Bit are greater: we expect transparency and interoperability, the ability to make our own (just as one can with the Arduino, as long as one does not seek to call it an Arduino without asking permission from the trademark owner), and the ability to control exactly how it works. Whether there is a need to develop a completely new hardware solution remains an unanswered question, but we may assert that should it be necessary, such a solution should be made available as open hardware to the maximum possible extent. And of course, the software controlling it should be Free Software.

As we edge gradually closer to September and the big deployment, it will be interesting to assess how the device and the associated initiative measure up to our expectations. Let us hope that the right lessons from the days of the BBC Micro have indeed been learned!

The Unplanned Obsolescence of the First Fairphone Device

Friday, December 12th, 2014

About a year-and-a-half ago, I gave my impressions of the Fairphone, noting that the initiative was worthy in terms of its social and sustainability goals, but that it had neglected the “fairness” of the software to be provided with each device. Although the Fairphone organisation had made “root access” – or more correctly stated, “owner control” – of the device a priority and had decided to provide its user interface enhancements to Android as Free Software, it had chosen to use a set of hardware technologies with a poor record of support for Free Software.

It might be said that such an initiative cannot possibly hope to act in the most prudent manner in every respect. However, unlike expertise in minerals sourcing, complicated global supply chains, and proprietary manufacturing activities, expertise on matters of hardware support for Free Software is available almost in abundance to anyone who can be bothered to ask. Many people already struggle with poorly-supported hardware for which only binary firmware or driver releases are available from the manufacturer, often resulting in incorrectly-performing hardware with no chance of future fixes as the manufacturer discontinues support in order to focus on selling new products. Others struggle with continuing but inconvenient forms of support on the manufacturer’s own timescale and terms.

Consequently, there are increasing numbers of people with experience of reverse engineering, documenting, and reimplementing firmware and drivers for proprietary hardware, many of whom would only be too happy to share their experiences with others wishing to avoid the pitfalls of being tied to a proprietary hardware vendor with a proprietary software mentality. There are also communities developing open hardware who seek out enlightened hardware vendors that encourage Free Software drivers for their products and may even support firmware that is also Free Software on their devices. There are even people developing smartphones in the open whose experiences and opinions would surely be valuable to anyone needing advice on the more reliable, open and trustworthy hardware vendors.

One community that has remained active despite various setbacks is the one pursuing the development of the EOMA-68 modular computing platform. It was precisely this kind of “ODM versus chipset vendor versus Free Software community” circus that prompted the development of an open platform and attempts to reach out and cultivate constructive communications with various silicon vendors. Such vendors, notably Allwinner Technology (in the case of EOMA-68), but also other companies that have previously been open to dialogue, have had the realisation that Free Software is an asset, and that Free Software communities are their partners and not just a bunch of people whose work can be taken and used without paying attention to the terms under which that work was originally shared. Such dialogues are ongoing and are subject to setbacks as well as progress, but it is far better to cultivate good practices than to ignore bad practices and to dump the ugly result onto the end-user.

Now, the Fairphone organisation has started to reconsider the software issue in light of the real possibility that their device will not be upgradeable beyond an old release of Android:

“We are actively looking at ways to achieve this goal, but we’re trying to be realistic and face the fact that the first Fairphones will most likely not be upgraded beyond Android 4.2.”

Given that the viability of devices depends not only on the continued functioning of the hardware but also on the correct functioning of the software, and that one motivation many people have stated for upgrading their phone is to gain access to a supported operating system distribution and/or one that supports the applications they need or desire, the unfortunate neglect of software sustainability has undermined the general sustainability of the device. It may very well be the case that the Fairphone organisation’s initiatives around re-use and recycling can mitigate the problems caused by any abandonment of these devices, as people seek out replacements that do what they demand of them. But one of the most potent goals, that of reducing consumption by providing a long-lasting device, has been undermined by something that should be the easiest part of the product to change, maintain and upgrade, and even to use to remedy shortcomings in the chosen physical components; something whose lifespan is dictated far less by physical constraints than that of the assembly of physical components making up the device itself.

It is quite possible that certain industry practices have remained unknown to the Fairphone organisation, despite bitter experiences elsewhere, and that they are only now catching up with what many other people have learned over the years:

“Our chipset vendor MediaTek is only publicly releasing what it is bound to by the obligatory terms of the GNU public license GPL (the Linux Kernel and a few userland programs) and has chosen not to release any of the Android source code.”

Once again, the GPL demonstrates its worth as a necessary tool to ensure that the end-user remains in control. Unfortunately, Google decided that the often shoddy practices of its hardware and industry partners should be indulged by allowing them to make proprietary products with Google’s permissively-licensed code. It could be worse: some hardware vendors violate the GPL and blame their suppliers, requiring anyone seeking recourse to traverse the supply chain as far as it goes, potentially to some obscure company in a faraway land whose management plead poverty while actually doing very nicely selling their services to anyone willing to pay; others just appear to brazenly violate the GPL and dare someone to sue them.

The Fairphone organisation could have valued the sustainability benefits of Free Software and cooperative hardware vendors. In doing so, by merely asking for informed opinions, they would have avoided this mess entirely. Unfortunately, they may have focused too narrowly only on certain worthy and necessary topics, maintaining an oversimplified view of software that, if mainstream media punditry is to be believed, is merely transient and interchangeable: something that can be made to run on any hardware as if by magic, with each upgrade replacing what was there before with something that is always better, only ever offering improvements and benefits. Those of us with more than a passing knowledge of systems development know that such beliefs are really delusions produced from a lack of experience or a wish to believe that unfamiliar things are easier than they actually are.

Since we cannot go back and change the way things were done before, I suppose that now is the time to deliver on the sustainability promise by fully and properly promoting and supporting Free Software on any future Fairphone device. This means that the Fairphone organisation has to start listening to people with experience of reliably deploying and supporting Free Software on open or properly-documented hardware, instead of going along with whatever some supplier (and their potentially GPL-violating associates) would have them do just to get the contract in the bag and the device out of the door.

Neo900: Turning the corner

Monday, December 2nd, 2013

Back when I last wrote about the status of the Neo900 initiative, the fundraising had just begun and the target was a relatively modest €25000 by “crowdfunding” standards. That target was soon reached, but it was only ever the initial target: the sum of money required to prototype the device and to demonstrate that the device really could be made and could eventually be sold to interested customers. Thus, to communicate the further objectives of the project, the Neo900 site updated its funding status bar to show objectives that go beyond mere demonstrations of feasibility and that also cover different levels of production.

So what happened here? Well, one of the slightly confusing things was that even though people were donating towards the project’s goals, it was not really possible to consider all of them as potential customers, so if 200 people had donated something (anything from, say, €10 right up to €5000), one could not really rely on them all coming back later to buy a finished device. People committing €100 or more might be considered likely purchasers, especially since donations of that size are effectively treated as pledges to buy and qualify for a rebate on a finished device, but people donating less might just be doing so to support the project. Indeed, people donating €100 or more might also only be doing so to support the project, but it is probably reasonable to expect that the more people have given, the more likely they are to want to buy something in the end. And, of course, if someone donates the entire likely cost of a device, a purchase has effectively been made already.

So even though the initiative was able to gauge a certain level of interest, it could not do so precisely just by considering the total amount of financial support it had been receiving. Consequently, by counting donations of €100 or more, a more realistic impression of the scale of eventual production could be obtained. As most people are aware, producing things in sufficient quantity may be the only way that a product can get made: setup costs, minimum orders of components, and other factors mean that small runs of production are prohibitively expensive. With 200 effective pledges to buy, the initiative can move beyond the prototyping phase and at least consider the production phase – when they are ready, of course – without worrying too much that there will be a lack of customers.

Since my last report, media coverage has extended into the technology mainstream, with Wired even publishing a news article about the initiative. Meanwhile, the project itself demonstrated mechanically compatible hardware and the modem hardware it intends to use, also summarising component availability and potential problems with the sourcing of certain components. For the most part, things are looking good indeed, with perhaps the only cloud on the horizon being a component with a 1000-unit minimum order quantity. That is why the project will not be stopping with 200 potential customers: the more people that jump on board, the greater the chances that everyone will be able to get a better configuration for the device.

If this were a mainstream “crowdfunding” effort, they might call that a “stretch goal”, but it is really a consequence of the way manufacturing is done these days, giving us economies of scale on the one hand, but raising the threshold for new entrants and independent efforts on the other. Perhaps we will eventually see innovations in small-scale manufacturing, not just in the widely-hyped 3D printing field, but for everything from electronic circuits to screens and cases, that may help eliminate some of the huge fixed costs and make it possible to design and make complicated devices relatively cheaply.

It will certainly be interesting to see how many more people choose to extend the lifespan of their N900 by signing up, or how many embrace the kind of smartphone that the “fickle market” supposedly does not want any more. Maybe as more people join in, others will be encouraged to follow, and so some kind of snowball effect might occur. Certainly, with the transparency shown in the project so far, people will at least be able to make an informed decision about whether to participate. And hopefully, we will eventually see some satisfied customers with open hardware running Free Software, good to go for another few years, emphasising once again that the combination is an essential ingredient in a sustainable technological society.

Neo900: And they’re off!

Friday, November 1st, 2013

Having mentioned the Neo900 smartphone initiative previously, it seems pertinent to note that it has moved beyond the discussion phase and into the fundraising phase. Compared to the Ubuntu Edge, the goals are fairly modest – 25000 euros versus tens of millions of dollars – but the way this money will be spent has been explained in somewhat more detail than appeared to be the case for the Ubuntu Edge. Indeed, the Neo900 initiative has released a feasibility study document describing the challenges confronting the project: it contains a lot more detail than the typical “we might experience some setbacks” disclaimer on the average Kickstarter campaign page.

It’s also worth noting that, as the Neo900 inherits a lot from the GTA04, as the title of the feasibility study document indicates when it refers to the device as the “GTA04b7”, and as the work is likely to be done largely under the auspices of the existing GTA04 endeavour, the fundraising is being done by Golden Delicious (the originators of the GTA04) themselves. From reading the preceding discussion around the project, popular fundraising sites appear to have conditions or restrictions that did not appeal to the project participants: Kickstarter has geographical limitations (coincidentally involving the signatory nations of the increasingly notorious UKUSA Agreement), and most fundraising sites also take a share of the raised funds. Such trade-offs may make sense for campaigns wanting to reach a large audience (and who know how to promote themselves to get prominence on such sites), but if you know who your audience is and how to reach them, and if you already have a functioning business, it could make sense to cut the big campaign sites out of the loop.

It will certainly be interesting to see what happens next. An Openmoko successor coming to the rescue of a product made by the mobile industry’s previously most dominant force: that probably isn’t what some people expected, either at Openmoko or at that once-dominant vendor.

Dell and the Hardware Vendors Page

Thursday, October 31st, 2013

Hugo complains about Dell playing around with hardware specifications on their Ubuntu-based laptop products. (Hugo has been raising some pretty interesting issues, lately!)

I think that one reason why Dell was dropped from the Hardware Vendors page on the FSFE Fellowship Wiki was that even though Dell was promoting products with GNU/Linux pre-installed, actually finding them remained a challenge involving navigating through page after page of "Dell recommends Windows Vista/Windows 8/Windows Whatever" before either finding a low-specification and overpriced afterthought of a product or just giving up on the whole idea.

Every time they “embrace Linux” I’d like to think that Dell are serious – indeed, Dell manages to support enterprise distributions of GNU/Linux on servers and workstations, so they can be serious, making their antics somewhat suspiciously incompetent at the “home and small office” level – but certainly, the issue of the changing chipset is endemic: I’m pretty sure that a laptop I had to deal with recently didn’t have the advertised chipset, and I tried as hard as possible to select the exact model variant, knowing that vendors switch things out “on the quiet” even for the same model. On that occasion, it was Lenovo playing around.

The first thing any major vendor should do to be taken seriously is to guarantee that if they sell a model with a specific model number then it has a precise and unchanging specification and that both the proper model number and the specification are publicly advertised. Only then can we rely on and verify claims of compatibility with our favourite Free Software operating systems.

Until then, I can only recommend buying a system from a retailer who will stand by their product and attempt to ensure that it will function correctly with the Free Software of your choice, not only initially but also throughout a decent guarantee period. Please help us maintain the Hardware Vendors page and to support vendors and retailers who support Free Software themselves.

(Note to potential buyers and vendors: the Hardware Vendors page does not constitute any recommendation or endorsement of products or services, nor does the absence of any vendor imply disapproval of that vendor’s products. The purpose of the page is to offer information about available products and services based on the experiences and research of wiki contributors, and as such is not a marketplace or a directory where vendors may request or demand to be represented. Indeed, the best way for a vendor to be mentioned on that page is to coherently and consistently offer products that work with Free Software and that satisfy customer needs so that someone may feel happy enough with their purchase that they want to tell other people about it. Yes, that’s good old-fashioned service being recognised and rewarded: an unusual concept in the modern world of business, I’m sure.)

The inside of some random Dell computer at a former workplace - this one may not have been running GNU/Linux, but my Dell workstation was

The Ben NanoNote: An Overlooked Hardware Experimentation Platform

Wednesday, October 30th, 2013

The Ben NanoNote is a pocket computer with a 3-inch screen and organiser-style keyboard announced in 2010 as the first in a line of copyleft hardware products under the Qi-Hardware umbrella: an initiative to collaboratively develop open hardware with full support for Free Software. With origins as an existing electronic dictionary product, the Ben NanoNote was customised for use as a general-purpose computing platform and produced in a limited quantity, with plans for successors that sadly did not reach full production.

The Ben NanoNote with illustrative beverage (not endorsed by anyone involved with this message, all trademarks acknowledged, call off the lawyers!)

When the Ben (as it is sometimes referred to in short form) first became known to a wider audience, many people focused on those specifications common to most portable devices sold today: the memory and screen size, and what kind of networking the device has (or doesn’t have). Some people wondered what the attraction was of a device that wasn’t wireless-capable when supposedly cheaper wireless communicator devices could be obtained. Even the wiki page for the Ben has really only prominently promoted the Free Software side of the device, mentioning its potential for making customised end-user experiences and its use as an appliance for open content or for music and video playback.

Certainly, the community around the Ben has a lot to be proud of with regard to Free Software support. A perusal of the Qi-Hardware news page reveals the efforts to make sure that the Ben was (and still is) completely supported by Free Software drivers within the upstream Linux kernel distribution. With a Free Software bootloader, the Ben is probably one of the few devices that could conceivably get some kind of endorsement for the complete absence of proprietary software, including firmware blobs, from organisations like the FSF who naturally care about such things. (Indeed, a project recommended by the FSF whose output appears to be closely related to the Ben’s default software distribution publishes a short guide to installing their software on the Ben.)

But not everybody focused only on the software upon learning about the device: some articles covered the full range of ambitions and applications anticipated for the Ben and for subsequent devices in the NanoNote series. And work got underway rather quickly to demonstrate how the Ben might complement the Arduino range of electronics prototyping and experimentation boards. Although there were concerns that the interfacing potential of the Ben might be a bit limited, with only USB peripheral support available via the built-in USB port (thus ruling out the huge range of devices accessible to USB hosts), the alternatives offered by the device’s microSD port appear to offer a degree of compensation. (The possibility of using SDIO devices had been mentioned at the very beginning, but SDIO is not as popular as some might have wished, and the Ben’s microSD support seems to go only as far as providing MMC capabilities in hardware, leaving out desirable features such as hardware SPI support that would make programming slightly easier and performance substantially better. Meanwhile, some people even took the NanoNote platform to a different level by reworking the Ben, freeing up connections for interfacing and adding an FPGA, but the resulting SIE device apparently didn’t make it beyond the academic environments for which it was designed.)

Thus, the Universal Breakout Board (UBB) was conceived: a way of “breaking out” or exposing the connections of the microSD port to external devices whilst communicating with those devices in a different way than one would with SD-based cards. Indeed, the initial impetus for the UBB was to investigate whether the Ben could be interfaced to an Ethernet board and thus provide dependency-free networking (as opposed to using Ethernet-over-USB to a networked host computer or suitably configured router). Sadly, some of those missing SD-related features have an impact on performance and practicality, but that doesn’t stop the UBB from being an interesting avenue of experimentation. To distinguish between SD-related usage and to avoid trademark issues, the microSD port is usually referred to as the 8:10 port in the context of the UBB.
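
For a flavour of what driving those exposed lines involves on the software side, here is a minimal sketch in C++. It assumes a kernel that provides the legacy /sys/class/gpio interface, that the program runs with suitable permissions, and it uses an entirely hypothetical GPIO number for one of the 8:10 port’s pads – the real numbers depend on how the SoC’s pins are mapped and are not taken from any documentation here.

    // Minimal sketch: toggle one (hypothetical) GPIO line exposed via the 8:10
    // port, using the kernel's legacy /sys/class/gpio interface. The pin number
    // below is a placeholder, not a real mapping for the Ben.
    #include <fstream>
    #include <string>
    #include <unistd.h>

    static void write_sysfs(const std::string &path, const std::string &value)
    {
        std::ofstream f(path.c_str());
        f << value;    // the kernel performs the requested GPIO operation
    }

    int main()
    {
        const std::string pin = "32";    // hypothetical GPIO number for one pad

        write_sysfs("/sys/class/gpio/export", pin);
        write_sysfs("/sys/class/gpio/gpio" + pin + "/direction", "out");

        // Toggle the line, much as a bit-banged protocol would drive a clock.
        for (int i = 0; i < 10; ++i) {
            write_sysfs("/sys/class/gpio/gpio" + pin + "/value", (i % 2) ? "1" : "0");
            usleep(100000);    // 100 ms between transitions
        }

        write_sysfs("/sys/class/gpio/unexport", pin);
        return 0;
    }

Real projects drive several lines at once with carefully chosen timing, but the principle is the same: pads that once spoke the SD card protocol become general-purpose signals under the program’s control.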

The Universal Breakout Board that plugs into the 8:10 (microSD) slot

(The UBB image originates from Qi-Hardware and is CC-BY-SA 3.0 licensed.)

Interfacing in Comfort

A lot of experimentation with computer-controlled electronics is done using microcontroller solutions like those designed and produced by Arduino, whose range of products starts with the modestly specified Arduino Uno with an ATmega328 CPU providing 32K (kilobytes) of flash memory and 2K of “conventional” static RAM (SRAM). Such specifications sound incredibly limiting, and when one considers that many microcomputers in 1983 – thirty years ago – had at least 32K of “conventional” readable and writable memory, albeit not all of it being always available for use on some machines, devices such as the Uno do not seem to represent much of an advance. However, such constraints can also be liberating: programs written to run in such limited space on the “bare metal” (there being no operating system) can be conceptually simple and focus on very specific interfacing tasks. Nevertheless, the platform requires some adjustment, too: data that will not be updated while a program runs on the device must be packed away in the flash memory where it obviously cannot be changed, and data that the device manipulates or collects must be kept within the limits of the precious SRAM, bearing in mind that the program stack may also be taking up space there, too.
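
To illustrate the kind of adjustment involved, here is a minimal Arduino-style sketch in C++ showing constant data parked in flash with PROGMEM and copied into SRAM only when needed; the data itself is hypothetical and merely stands in for whatever lookup table a real program might carry.

    // Minimal sketch: keep unchanging data in flash, reserving scarce SRAM
    // for the working data that the program actually manipulates.
    #include <avr/pgmspace.h>

    // 256 bytes of constant (hypothetical) data stored in flash, not SRAM;
    // unspecified entries are zero.
    const uint8_t waveform[256] PROGMEM = { 0, 3, 6, 9, 12, 15 };

    uint8_t buffer[32];    // working data occupies the precious SRAM

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      // Flash-resident constants must be read out explicitly before use.
      for (int i = 0; i < 32; ++i) {
        buffer[i] = pgm_read_byte(&waveform[i]);
      }
      Serial.println(buffer[4]);    // prints 12 once per second
      delay(1000);
    }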

As a consequence, the Arduino platform benefits from a vibrant market in add-ons that extend the basic boards such as the Uno with useful capabilities that in some way make up for those boards’ deficiencies. For example, there are several “shield” add-on products that provide access to SD cards for “data logging”: essential given that the on-board SRAM is not likely to be able to log much data (and is volatile), and given that the on-board flash memory cannot be rewritten during operation. Other add-ons requiring considerable amounts of data also include such additional storage, so that display shields will incorporate storage for bitmaps that might be shown on the display: the Arduino TFT LCD Screen does precisely this by offering a microSD slot. On the one hand, the basic boards offer only what people really need as the foundational component of their projects, but this causes add-on designers to try and remedy the lack of useful core functionality at every turn, putting microSD or SD storage on every shield or extension board just because the user might not have such capabilities already.

Having said all this, the Arduino platform generally only makes you pay for what you need, and for many people this is interesting enough that it explains Arduino’s continuing success. Nevertheless, for some activities the Arduino platform is perhaps too low-level, and building and combining the capabilities one might need for a project – combining an Arduino board with numerous shields and other extensions – would be troublesome and possibly unsatisfactory. At some point, one might see the need to discard the “form factor” of the Arduino and to use the technological building blocks that comprise the average Arduino board – the microcontroller and other components – in order to make a more integrated, more compact device with the additional capabilities of choice. For instance, if one wanted to make a portable music player with Arduino, one could certainly acquire shields each providing a screen and controls (and microSD slot), headphone socket and audio playback (and microSD slot), and combine them with the basic board, hoping that they are compatible or working round any incompatibilities by adding yet more hardware. And then one would want to consider issues of power: whether using a simple battery-to-power-jack solution would be good enough or whether there should be a rechargeable battery with the necessary power circuit. But the result of combining an Arduino with shields would not be as practical as a more optimised device.

The Arduino Duemilanove attached to a three-axis accelerometer breakout board

The Opportunity

In contrast to the “bare metal” approach, people have been trying and advocating other approaches in recent times. First of all, it has been realised that many people would prefer the comfort of their normal computing environment, or at least one with many of the capabilities they have come to expect, to a very basic environment that has to be told to run a single, simple program that needs to function correctly or be made to offer some rudimentary support for debugging when deployed. Those who promote solutions like the Raspberry Pi note that it runs a desktop operating system, typically GNU/Linux, and thus offers plenty of facilities for writing programs and running them in pleasant ways. Thus, interfacing with other hardware becomes more interactive and easier to troubleshoot, and it might even permit writing interfacing programs in high-level languages like Python as opposed to low-level languages like C or assembly language. In effect, the more capable platform with its more generous resources, faster processor and a genuine operating system provide opportunities that a microcontroller-based solution just cannot provide.

If this were all that was important then we would surely have reached the end of the matter, but there is more to consider. The Raspberry Pi is really a desktop computer replacement: you plug in USB peripherals and a screen and you potentially have a replacement for your existing x86-based personal computer. But it is in many ways only as portable and as mobile as the Arduino, and absolutely less so in its primary configuration. Certainly, people have done some interesting experiments adding miniature keyboards and small screens to the Raspberry Pi, but it starts to look like the situation with the Arduino when trying to build up some capability or other from too low a starting point. Such things are undoubtedly achievements in themselves, and like climbing a mountain or going into space, showing that it could be done is worthy of our approval, but just like those great achievements it would be a shame to go to all that effort without also doing a bit of science in the process, would it not?

An e-paper display connected to the Ben NanoNote via a cable and the Sparkfun "microSD Sniffer" board

This is where the Ben enters the picture. Because it is a pocket computer with a built-in screen and battery (and keyboard), you can perform various mobile experiments without having to gear up to be mobile in the first place. Perhaps most of the time, it may well be sitting on your desk next to a desktop computer and being remotely accessed using a secure shell connection via Ethernet-over-USB, acting as a mere accessory for experimentation. An example of this is a little project where I connected an e-paper screen to the Ben to see how hard it would be to make that screen useful. Of course, I could also take this solution “on the road” if I wanted, and it would be largely independent of any other computing infrastructure. I will admit to not having run native compilers on the Ben myself – everything I have compiled has actually been cross-compiled using the OpenWrt toolchain targeting the Ben – but it should be possible to develop on the road, too.

But for truly mobile experimentation, where having an untethered device is part of the experiment, the Ben offers something that the Raspberry Pi and various single-board computer solutions do not: visualisation on the move. One interesting project that pioneered this is the UBB-LA (UBB Logic Analyzer) which accepts signals via the Ben’s 8:10 port and displays a time-limited capture of signal data on the screen. One area that has interested me for a while has been that of orientation and motion sensor data, with the use of gyroscopes and accelerometers to determine the orientation and motion of devices. Since there are many “breakout boards” (small boards providing convenient access to components) offering these kinds of sensors, and since the communication with these sensors is within the constraints of the 8:10 port bandwidth, it became attractive to use the Ben to prototype some software for applications which might use such data, and the Ben’s screen provides a useful way of visualising the interpretation of the data by the software. Thus, another project was conceived that hopefully provides the basis for more sophisticated experiments into navigation, interaction and perhaps even things like measurement.
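
As a small illustration of the kind of interpretation involved, the following C++ sketch derives pitch and roll angles from a set of raw accelerometer readings. The numbers are hypothetical stand-ins for values that might be read over the 8:10 port, and the calculation assumes the device is more or less stationary so that gravity dominates the measurement.

    // Minimal sketch: estimate device tilt from (hypothetical) accelerometer
    // readings expressed in units of g, assuming gravity is the dominant force.
    #include <cmath>
    #include <cstdio>

    int main()
    {
        double ax = 0.02, ay = -0.15, az = 0.98;    // hypothetical sample

        // Gravity's projection onto each axis reveals the device's tilt.
        double pitch = std::atan2(-ax, std::sqrt(ay * ay + az * az));
        double roll  = std::atan2(ay, az);

        const double to_degrees = 180.0 / 3.14159265358979323846;
        std::printf("pitch: %.1f degrees, roll: %.1f degrees\n",
                    pitch * to_degrees, roll * to_degrees);
        return 0;
    }

Gyroscope and magnetometer readings can then be blended in to steady the estimate and to provide a heading, which is where the more sophisticated experiments mentioned above begin.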

The Pololu MinIMU-9 board connected to the Ben NanoNote in a horizontal position, showing the orientation information from the board

The Differentiator

Of course, sensor applications are becoming commonplace due to the inclusion of gyroscopes, accelerometers, magnetometers and barometers in smartphones. Indeed, this was realised within the open hardware community several years ago with the production of the Openmoko Freerunner Navigation Board that featured such sensors and additional interfacing components. Augmented reality applications and fancy compass visualisations are becoming standard features on smartphones, complementing navigation data from GPS and comparable satellite navigation systems, and the major smartphone software vendors provide APIs to access such components. Indeed, many of the components used in smartphones feature on the breakout boards mentioned above, unsurprisingly, because their cost has been driven down, making them available and affordable for a wider range of applications.

So what makes the Ben NanoNote interesting when you could just buy a smartphone and code against, say, the Android API? Well, as many people familiar with software and hardware freedom will already know, being able to use an API is perhaps only enough if you never intend to improve the software providing that API, or share that software and any improvements you make to it with others, and if you never want to know what the code is really doing, anyway. Furthermore, you may not be able to change or upgrade that software or to deploy your own software on a smartphone you have bought, despite vigorous efforts to make this your right.

Even if you think that the software providers have done a good job interpreting the sensor data and translating it into something usable that a smartphone application might use, and this is not a trivial achievement by any means, you may also want to try and understand how the device interacts with the sensors at a lower level. It is rather likely that an Android smartphone, for example, will communicate with an accelerometer using a kernel module that resides in the upstream Linux kernel source code, and you could certainly take a look at that code, but as thirty years of campaigning for software freedom has shown, taking a look is not as good as improving, sharing and deploying that code yourself. Certainly, you could “root” your smartphone and install an alternative operating system that gives you the ability to develop and deploy kernel modules and even user-space code – normal programs – that access the different sensors, but you take your chances doing so.

Meanwhile, the Ben encourages experimentation by letting you re-flash the bootloader and operating system image, and you can build your own kernel and root filesystem populated with programs of your choice, all of it being Free Software and doing so using only Free Software. Things that still surprise people with modified smartphone images like being able to log in and get a secure shell session and to run “normal Linux programs” are the very essence of the Ben. It may not have wireless or cellular networking as standard – a much discussed topic that can be solved in different ways – but that can only be good news for the battery life.

The Successor?

“What would I use it for?” That might have been my first reaction to the Ben when I first heard about it. I don’t buy many gadgets – my mobile telephone is almost ten years old – and I always try to justify gadget purchases, perhaps because growing up in an age when things like microcomputers were appearing but were hardly impulse purchases, one learns how long-lived technology products can be when they are made to last and made to a high quality: their usefulness does not cease merely because a new product has become available. Having used the Ben, I find it clear that although its perceived origins as some kind of dictionary, personal organiser or music player are worthy in themselves, it just happens to offer some fun as a platform for hardware experimentation.

I regard the Ben as something of a classic. It might not have a keyboard that would appeal to those who bought Psion organiser products at the zenith of that company’s influence, and its 320×240 screen seems low resolution in the age of large screen laptops and tablets with “retina” displays, but it represents something that transcends specifications and yet manages to distract people when they see one. Sadly, it seems likely that remaining stocks will eventually be depleted and that opportunities to acquire a new one will therefore cease.

Plans for direct successors of the Ben never really worked out, although somewhat related products do exist: the GCW-Zero uses an Ingenic SoC just like the Ben does, and software development for both devices engages a common community in many respects, but the form factor is obviously different. The Pandora has a similar form factor and a higher specification, but it also carries a higher price and is apparently not open hardware. The Neo900, if it comes to pass (and it hopefully will), may offer a good combination of Free Software and open hardware, but it will understandably not come cheap.

One day, well-funded organisations may recognise and reward the efforts of the open hardware pioneers. Imitating aspirational product demonstrations and trying to get in on the existing action is all very well, not to mention going your own way, but getting involved in the open hardware communities and helping them to build new things would benefit everybody. I can only hope that such organisations come to their senses soon so that more people can have the opportunity to play with sensors, robotics and all the other areas of hardware experimentation, and that a healthy diversity of platforms may be sustained to encourage such experimentation long into the future.

The Ben NanoNote and regular-sized desktop computer accessories