Paul Boddie's Free Software-related blog

You can learn a lot from people’s terminology

Tuesday, June 16th, 2015

The Mailpile project has been soliciting feedback on the licensing of their product, but I couldn’t let one of the public responses go by without some remarks. Once upon a time, as many people may remember, a disinformation campaign was run by Microsoft to attempt to scare people away from copyleft licences, employing insensitive terms like “viral” and “cancer”. And so, over a decade later, here we have an article employing the term “viral” liberally to refer to copyleft licences.

Now, as many people already know, copyleft licences are applied to works by their authors so that those wishing to contribute to the further development of those works will do so in a way that preserves the “share-alike” nature of those works. In other words, the recipient of such works promises to extend to others the privileges they experienced themselves upon receiving the work, notably the abilities to see and to change how it functions, and the ability to pass on the work, modified or not, under the same conditions. Such “fair sharing” is intended to ensure that everyone receiving such works may be equal participants in experiencing and improving the work. The original author is asking people to join them in building something that is useful for everyone.

Unfortunately, all this altruism is frowned upon by some individuals and corporations who would prefer to be able to take works, to use, modify and deploy them as they see fit, and to refuse to participate in the social contract that copyleft encourages. Instead, those individuals and corporations would rather keep their own modifications to such works secret, or even go as far as to deny others the ability to understand and change any part of those works whatsoever. In other words, some people want a position of power over their own users or customers: they want the money that their users and customers may offer – the very basis of the viability of their precious business – and in return for that money they will deny their users or customers the opportunity to know even what goes into the product they are getting, never mind giving them the chance to participate in improving it or exercising control over what it does.

From the referenced public response to the licensing survey, I learned another term: “feedstock”. I will admit that I had never seen this term used before in the context of software, or I don’t recall its use in such a way, but it isn’t difficult to transfer the established meaning of the word to the context of software from the perspective of someone portraying copyleft licences as “viral”. I suppose that here we see another divide being erected between people who think they should have most of the power (and who are somehow special) and the grunts who merely provide the fuel for their success: “feedstock” apparently refers to all the software that enables the special people’s revenue-generating products with their “secret ingredients” (or “special sauce” as the author puts it) to exist in the first place.

If you spend any time or effort on writing software, it should be worrying that, by permissively licensing your work, you allow it to be treated as mere “feedstock” by people who only appreciate it as far as they can use it without giving you a second thought. To be fair, the article’s author does encourage contributing back to projects as “good karma and community”, but then again this statement is made in the context of copyleft-licensed projects, and the author spends part of a paragraph bemoaning the chore of finding permissively-licensed projects so as not to have to contribute anything back at all. If you don’t mind working for companies for free and being told that you don’t deserve to see what they did to your own code that they nevertheless couldn’t get by without, maybe a permissive licence is a palatable choice for you. But remember that the permissive licensing will most likely be used to take privileges away from other recipients: those unfortunates who are paying good money won’t get to see how your code works with all the “secret stuff” bolted on, either.

Once upon a time, Bill Gates remarked, “A lot of customers in a sense don’t want — the notion that they would go in and tinker with the source code, that’s the opposite of what’s supposed to go on. We’re supposed to give that to them and it’s our problem to make sure that it works perfectly and they rely on us.” This, of course, comes from a man who enjoys substantial power through accumulation of wealth by various means, many of them involving the denial of choice, control and power to others. It is high time we all stopped listening to people who disempower us at every opportunity so that they can enrich themselves at our expense.

The BBC Micro and the BBC Micro Bit

Sunday, March 22nd, 2015

At least on certain parts of the Internet as well as in other channels, there has been a degree of excitement about the announcement by the BBC of a computing device called the “Micro Bit”, with the BBC’s plan to give one of these devices to each child starting secondary school, presumably in September 2015, attracting particular attention amongst technology observers and television licence fee-payers alike. Details of the device are a little vague at the moment, but the announcement along with discussions of the role of the corporation and previous initiatives of this nature provides me with an opportunity to look back at the original BBC Microcomputer, to evaluate some of the criticisms (and myths) around the associated Computer Literacy Project, and to consider the things that were done right and wrong, with the latter hopefully not about to be repeated in this latest endeavour.

As the public record reveals, at the start of the 1980s, the BBC wanted to engage its audience beyond television programmes describing the growing microcomputer revolution, and it was decided that to do this and to increase computer literacy generally, it would need to be able to demonstrate various concepts and technologies on a platform that would be able to support the range of activities to which computers were being put to use. Naturally, a demanding specification was constructed – clearly, the scope of microcomputing was increasing rapidly, and there was a lot to demonstrate – and various manufacturers were invited to deliver products that could be used as this reference platform. History indicates that a certain amount of acrimony followed – a complete description of which could fill an entire article of its own – but ultimately only Acorn Computers managed to deliver a machine that could do what the corporation was asking for.

An Ambitious Specification

It is worth considering what the BBC Micro was offering in 1981, especially when considering ill-informed criticism of the machine’s specifications by people who either prefer other systems or who felt that participating in the development of such a machine was none of the corporation’s business. The technologies to be showcased by the BBC’s programme-makers and supported by the materials and software developed for the machine included full-colour graphics, multi-channel sound, 80-column text, Viewdata/Teletext, cassette and diskette storage, local area networking, interfacing to printers, joysticks and other input/output devices, as well as to things like robots and user-developed devices. It is easy to pick out one or two of these requirements, move forwards a year or two, increase the budget two- or three-fold (or any combination of these things) and nominate various other computers, but there really were few existing systems that could deliver all of the above, at least at an affordable price at the time.

Some microcomputers of the early 1980s

Computer | RAM | Text | Graphics | Year | Price
-------- | --- | ---- | -------- | ---- | -----
Apple II Plus | Up to 64K | 40 x 25 (upper case only) | 280 x 192 (6 colours), 40 x 48 (16 colours) | 1979 | £1500 or more
Commodore PET 4032/8032 | 32K | 40/80 x 25 | Graphics characters (2 colours) | 1980 | £800 (4032), £1030 (8032) (including monochrome monitor)
Commodore VIC-20 | 5K | 22 x 23 | 176 x 184 (8 colours) | 1980 (1981 outside Japan) | £199
IBM PC (Model 5150) | 16K up to 256K | 40/80 x 25 | 640 x 200 (2 colours), 320 x 200 (4 colours) | 1981 | £1736 (including monochrome monitor, presumably with 16K or 64K)
BBC Micro (Model B) | 32K | 80/40/20 x 32/24, Teletext | 640 x 256 (2 colours), 320 x 256 (2/4 colours), 160 x 256 (4/8 colours) | 1981 | £399 (originally £335)
Research Machines LINK 480Z | 64K (expandable to 256K) | 40 x 24 (optional 80 x 24) | 160 x 72, 80 x 72 (2 colours); expandable to 640 x 192 (2 colours), 320 x 192 (4 colours), 190 x 96 (8 colours or 16 shades) | 1981 | £818
ZX Spectrum | 16K or 48K | 32 x 24 | 256 x 192 (16 colours applied using attributes) | 1982 | £125 (16K), £175 (48K)
Commodore 64 | 64K | 40 x 25 | 320 x 200 (16 colours applied using attributes) | 1982 | £399

Perhaps the closest competitor, already being used in a fairly limited fashion in educational establishments in the UK, was the Commodore PET. However, it is clear that despite the adaptability of that system, its display capabilities were becoming increasingly uncompetitive, and Commodore had chosen to focus on the chipsets that would power the VIC-20 and Commodore 64 instead. (The designer of the PET went on to make the very capable, and understandably more expensive, Victor 9000/Sirius 1.) That Apple products were notoriously expensive and, indeed, the target of Commodore’s aggressive advertising did not seem to prevent them from capturing the US education market from the PET, but they always remained severely uncompetitive in the UK as commentators of the time indeed noted.

Later, the ZX Spectrum and Commodore 64 were released. Technology was progressing rapidly, and in hindsight one might have advocated waiting around until more capable and cheaper products came to market. However, given the need to fulfil virtually all aspects of the ambitious specification at an affordable price, it can be argued that a suitable alternative product would not have become available until the release of the Amstrad CPC series in 1984. Even then, these Amstrad computers actually benefited from the experience accumulated in the UK computing industry from the introduction of the BBC Micro: they were, if anything, an iteration within the same generation of microcomputers, and would even have used the same 6502 CPU as the BBC Micro had it not been for time-to-market pressures and the readily-available expertise with the Zilog Z80 CPU amongst those in the development team. And yet specific aspects of the specification would still have gone unfulfilled: the BBC Micro had hardware support for Teletext displays, whereas the Amstrad machines would have had to emulate these with a bitmapped display and suitable software.

Arise Sir Clive

Much has been made of the disappointment of Sir Clive Sinclair that his computers were not adopted by the BBC as products to be endorsed and targeted at schools. Sinclair made his name developing products that were competitive on price, often seeking cost-reduction measures to reach attractive pricing levels, but such measures also served to make his products less desirable. If one reads reviews of microcomputers from the early 1980s, many reviewers explicitly mention the quality of the keyboard provided by the computers being reviewed: a “typewriter” keyboard with keys that “travel” appears to be much preferred over the “calculator” keyboards provided by computers like the ZX Spectrum, Oric 1 or Newbury NewBrain, and vastly preferred over the “membrane” keyboards employed by the ZX80, ZX81 and Atari 400.

For target audiences in education, business, and in the home, it would have been inconceivable to promote a product with anything less than a “proper” keyboard. Ultimately, the world had to wait until 1986 and the ZX Spectrum +2 for a Spectrum with such a keyboard, and that occurred only after the product line had been acquired by Amstrad. (One might also consider the ZX Spectrum+ in 1984, but its keyboard was more of a hybrid of the calculator keyboard that had been used before and the “full-travel” keyboards provided by its competitors.)

Some people claim that they owe nothing to the BBC Micro and everything to the ZX Spectrum (or, indeed, the computer they happened to own) for their careers in computing. Certainly, the BBC Micro was an expensive purchase for many people, although contrary to popular assertion it was not any more expensive than the Commodore 64 upon that computer’s introduction in the UK, and for those of us who wanted BBC compatibility at home on a more reasonable budget, the Acorn Electron was really the only other choice. But it would be as childish as the playground tribalism that had everyone insist that their computer was “the best” to insist that the BBC Micro had no influence on computer literacy in general, or on the expectations of what computer systems should provide. Many people who owned a ZX Spectrum freely admit that the BBC Micro coloured their experiences, some even subsequently seeking to buy one, or one of its successors, and going on to build a successful software development career.

The Costly IBM PC

Some commentators seem to consider the BBC Micro as having been an unnecessary diversion from the widespread adoption of the IBM PC throughout British society. As was the case everywhere else, the de-facto “industry standard” of the PC architecture and DOS captured much of the business market and gradually invaded the education sector from the top down, although significantly better products existed both before and after its introduction. It is tempting with hindsight to believe that by placing an early bet on the success of the then-new PC architecture, business and education could have somehow benefited from standardising on the PC and DOS applications. And there has always been the persistent misguided belief amongst some people that schools should be training their pupils/students for a narrow version of “the world of work”, as opposed to educating them to be able to deal with all aspects of their lives once their school days are over.

What many people forget or fail to realise is that the early 1980s witnessed rapid technological improvement in microcomputing, that there were many different systems and platforms, some already successful and established (such as CP/M), and others arriving to disrupt ideas of what computing should be like (the Xerox Alto and Star having paved the way for the Apple Lisa and Macintosh, the Atari ST, and so on). It was not clear that the IBM PC would be successful at all: IBM had largely avoided embracing personal computing, and although the system was favourably reviewed and seen as having the potential for success, thanks to IBM’s extensive sales organisation, other giants of the mainframe and minicomputing era such as DEC and HP were pursuing their own personal computing strategies. Moreover, existing personal computers were becoming entrenched in certain markets, and early adopters were building a familiarity with those existing machines that was reflected in publications and materials available at the time.

Despite the technical advantages of the IBM PC over much of the competition at the beginning of the 1980s, it was also substantially more expensive than the mass-market products arriving in significant numbers, aimed at homes, schools and small businesses. With many people remaining intrigued but unconvinced by the need for a personal computer, it would have been impossible for a school to justify spending almost £2000 (probably around £8000 today) on something without proven educational value. Software would also need to be purchased, and the procurement of expensive and potentially non-localised products would have created even more controversy.

Ultimately, the Computer Literacy Project stimulated the production of a wide range of locally-produced products at relatively inexpensive prices. And while there may have been a few years of children learning BBC BASIC instead of one of the variants of BASIC for the IBM PC (before BASIC became a deprecated aspect of DOS-based computing), it is hard to argue that those children missed out on any valuable experience with DOS commands or specific DOS-based products: DOS itself became a largely forgotten environment as technological progress introduced new paradigms and products, making “hard-wired”, product-specific experience obsolete.

The Good and the Bad

Not everything about the BBC Micro and its introduction can be considered unconditionally good. Choices needed to be made to deliver a product that could fulfil the desired specification within certain technological constraints. Some people like to criticise BBC BASIC as being “non-standard”, for example, which neglects the diversity of BASIC dialects that existed at the dawn of the 1980s. Typically, for such people “standard” equates to “Microsoft”, but back then Microsoft BASIC was a number of different things. Commodore famously paid a one-off licence fee to use Microsoft BASIC in its products, but the version for the Commodore 64 was regarded as lacking user-friendly support for graphics primitives and other interesting hardware features. Meanwhile, the MSX range of microcomputers featured Microsoft Extended BASIC which did provide convenient access to hardware features, although the MSX range was not the success at the low end of the market that Microsoft had probably desired to complement its increasing influence at the higher end through the IBM PC. And it is informative in this regard to see just how many variants of Microsoft BASIC were produced, thanks to Microsoft’s widespread licensing of its software.

Nevertheless, the availability of one company’s products does not make a standard, particularly if interoperability between those products is limited. Neither BBC BASIC nor Microsoft BASIC can be regarded as anything other than de-facto standards in their own territories, and it is nonsensical to regard one as non-standard when the other has largely the same characteristics as a proprietary product in widespread use, even if it was licensed to others, as indeed both Microsoft BASIC and BBC BASIC were. Genuine attempts to standardise BASIC did indeed exist, notably BASICODE, which was used in the distribution of programs via public radio broadcasts. One suspects that people making casual remarks about standard and non-standard things remain unaware of such initiatives. Meanwhile, Acorn did deliver implementations of other standards-driven programming languages such as COMAL, Pascal, Logo, Lisp and Forth, largely adhering to any standards subject to the limitations of the hardware.

However, what undermined the BBC Micro and Acorn’s related initiatives over time was the control that they as a single vendor had over the platform and its technologies. At the time, a “winner takes all” mentality prevailed: Commodore under Jack Tramiel had declared a “price war” on other vendors and had caused difficulties for new and established manufacturers alike, with Atari eventually being sold to Tramiel (who had resigned from Commodore) by Warner Communications, but many companies disappeared or were absorbed by others before half of the decade had passed. Indeed, Acorn, who had released the Electron to compete with Sinclair Research at the lower end of the market, and who had been developing product lines to compete in the business sector, experienced financial difficulties and was ultimately taken over by Olivetti; Sinclair, meanwhile, experienced similar difficulties and was acquired by Amstrad. In such a climate, ideas of collaboration seemed far from everybody’s minds.

Since then, the protagonists of the era have been able to reflect on such matters, with Acorn co-founder Hermann Hauser admitting that it may have been better to license Acorn’s Econet local area networking technology to interested competitors like Commodore. Although the sentiments might have something to do with revenues and influence – it was at Acorn that the ARM processor was developed, sowing the seeds of a successful licensing business today – the rest of us may well ask what might have happened had the market’s participants of the era cooperated on things like standards and interoperability, helping their customers to protect their investments in technology, and building a bigger “common” market for third-party products. What if they had competed on bringing technological improvements to market without demanding that people abandon their existing purchases (and cause confusion amongst their users) just because those people happened to already be using products from a different vendor? It is interesting to see the range of available BBC BASIC implementations and to consider a world where such building blocks could have been adopted by different manufacturers, providing a common, heterogeneous platform built on cooperation and standards, not the imposition of a single hardware or software platform.

But That Was Then

Back then, as Richard Stallman points out, proprietary software was the norm. It would have been even more interesting had the operating systems and the available software for microcomputers been Free Software, but that may have been asking too much at the time. And although computer designs were often shared and published, a tendency to prevent copying of commercial computer designs prevailed, with Acorn and Sinclair both employing proprietary integrated circuits mostly to reduce complexity and increase performance, but partly to obfuscate their hardware designs, too. Thus, it may have been too much to expect something like the BBC Micro to have been open hardware to any degree “back in the day”, although circuit diagrams were published in publicly-available technical documentation.

But we have different expectations now. We expect software to be freely available for inspection, modification and redistribution, knowing that this empowers the end-users and reassures them that the software does what they want it to do, and that they remain in control of their computing environment. Increasingly, we also expect hardware to exhibit the same characteristics, perhaps only accepting that some components are particularly difficult to manufacture and that there are physical and economic restrictions on how readily we may practise the modification and redistribution of a particular device. Crucially, we demand control over the software and hardware we use, and we reject attempts to prevent us from exercising that control.

The big lesson to be learned from the early 1980s, to be considered now in the mid-2010s, is not how to avoid upsetting a popular (but ultimately doomed) participant in the computing industry, as some commentators might have everybody believe. It is to avoid developing proprietary solutions that favour specific organisations and that, despite the general benefits of increased access to technology, ultimately disempower the end-user. And in this era of readily available Free Software and open hardware platforms, the lesson to be learned is to strengthen such existing platforms and to work with them, letting those products and solutions participate and interoperate with the newly-introduced initiative in question.

The BBC Micro was a product of its time and its development was very necessary to fill an educational need. Contrary to the laziest of reports, the Micro Bit plays a different role as an accessory rather than as a complete home computer, at least if we may interpret the apparent intentions of its creators. But as a product of this era, our expectations for the Micro Bit are greater: we expect transparency and interoperability, the ability to make our own (just as one can with the Arduino, as long as one does not seek to call it an Arduino without asking permission from the trademark owner), and the ability to control exactly how it works. Whether there is a need to develop a completely new hardware solution remains an unanswered question, but we may assert that should it be necessary, such a solution should be made available as open hardware to the maximum possible extent. And of course, the software controlling it should be Free Software.

As we edge gradually closer to September and the big deployment, it will be interesting to assess how the device and the associated initiative measure up to our expectations. Let us hope that the right lessons from the days of the BBC Micro have indeed been learned!

Lenovo: What Were They Thinking?

Wednesday, February 25th, 2015

In the past few days, there have been plenty of reports of Lenovo shipping products with a form of adware known as Superfish, originating from a company of the same name, that interferes with the normal operation of Web browser software to provide “shopping suggestions” in content displayed by the browser. This would be irritating enough all by itself, but what made the bundled software involved even more worrying was that it also manages to insert itself as an eavesdropper on the user’s supposedly secure communications, meaning that communications conducted between the user and Internet sites such as online banks, merchants, workplaces and private-and-confidential services are all effectively compromised.

Making things worse still, the mechanism employed to pursue this undesirable eavesdropping managed to prove highly insecure in itself, exposing Lenovo customers to attack from others. So, we start this sordid affair with a Lenovo “business decision” about bundling some company’s software and end up with Lenovo’s customers having their security compromised for the dubious “benefit” of being shown additional, unsolicited advertisements in Web pages that didn’t have them in the first place. One may well ask: what were Lenovo’s decision-makers thinking?
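To make the eavesdropping mechanism concrete: software of this kind installs its own root certificate in the system’s trust store and then substitutes its own certificates for those of every secure site the user visits, so the browser sees a chain ending at the interceptor’s root rather than at the site’s real certificate authority. The following is a minimal sketch of how one might spot such substitution from Python using only the standard library’s ssl module; the SUSPECT_ISSUERS set and the helper names are illustrative, not part of any real detection tool.

```python
import socket
import ssl

# Issuer organisations that would indicate interception; "Superfish, Inc."
# is the organisation name that appeared on the Superfish root certificate.
SUSPECT_ISSUERS = {"Superfish, Inc."}

def issuer_organisation(cert):
    """Extract the issuer's organizationName from a certificate dictionary
    of the shape returned by ssl.SSLSocket.getpeercert()."""
    # The issuer field is a tuple of RDNs, each itself a tuple of
    # (attribute name, value) pairs.
    for rdn in cert.get("issuer", ()):
        for name, value in rdn:
            if name == "organizationName":
                return value
    return None

def certificate_for(host, port=443):
    """Fetch the certificate actually presented to this machine for host.
    Under interception, this is the proxy's substitute certificate."""
    context = ssl.create_default_context()  # validates against the local store
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert()

def looks_intercepted(host):
    """True if the certificate seen for host names a suspect issuer."""
    return issuer_organisation(certificate_for(host)) in SUSPECT_ISSUERS
```

Note that on an affected machine the substitute certificate validates without complaint, precisely because the interceptor’s root was added to the trust store; the giveaway is the unexpected issuer. In the actual incident the situation was worse still, since the same root private key shipped on every affected machine, allowing anyone who extracted it to forge certificates that those machines would accept.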

Symptoms of a Disease

Indeed, this affair gives us a fine opportunity to take a critical look at the way the bundling of software has corrupted the sale of personal computers for years, if not decades. First of all, most customers have never been given a choice of operating system, or the option of buying a computer without one, considering the major channels and vendors to which most buyers are exposed: the most widely-available and widely-advertised computers only offer some Windows variant, and manufacturers typically insist that they cannot offer anything else – or even nothing at all – for a variety of feeble reasons. And when asked to provide a refund for this unwanted product that has been forced on the purchaser, some manufacturers even claim that it is free or that someone else has subsidised the cost, and that there is no refund to be had.

This subsidy – some random company acting like a kind of wealthy distant relative paying for the “benefit” of bundled proprietary software – obviously raises competition-related issues, but it also raises the issue of why anyone would want to pay for someone else to get something at no cost. Even in a consumer culture where getting more goodies is seen as surely being a good thing because it means more toys to play with, one cannot help but be a little suspicious: surely something is too good to be true if someone wants to give you things that they would otherwise make you pay for? And now we know that it is: the financial transaction that enriched Lenovo was meant to give Superfish access to its customers’ sensitive information.

Of course, Lenovo’s updated statement on the matter (expect more updates, particularly if people start to talk about class action lawsuits) tries to downplay the foul play: the somewhat incoherent language (example: “Superfish technology is purely based on contextual/image and not behavioral”) denies things like user profiling and uses terminology that is open to quite a degree of interpretation (example: “Users are not tracked nor re-targeted”). What the company lawyers clearly don’t want to talk about is what information was being collected and where it was being whisked off to, keeping the legal attack surface minimal and keeping those denials of negligence strenuous (“we did not know about this potential security vulnerability until yesterday”). Maybe some detail about those “server connections shut down in January” would shed some light on these matters, but the lawyers know that with that comes the risk of exposing a paper trail showing that everybody knew what they were getting into.

Your Money isn’t Good Enough

One might think that going to a retailer, giving them your money, and getting a product to take home would signal the start of a happy and productive experience with a purchase. But it seems that for some manufacturers, getting the customer’s money just isn’t enough: they just have to make a bit of money on the side, and perhaps keep making money from the product after the customer has taken it home, too. Consumer electronics and products from the “content industries” have in particular fallen victim to the introduction of advertising. Even though you thought you had bought something outright, advertisements and other annoyances sneak into the experience, often in the hope that you will pay extra to make them go away.

And so, you get the feeling that your money somehow isn’t good enough for these people. Maybe if you were richer or knew the right people, your money would be good enough and you wouldn’t need to suffer adverts or people spying on you, but you aren’t rich or well-connected and just have to go along with the indignity of it all. Naturally, the manufacturers would take offence at such assertions; they would claim that they have to take bribes (sorry, subsidies) to be able to keep their own prices competitive with the rest of the market, and of course everybody else is taking the money. That might be almost believable if it weren’t for the fact that the prices of things like bundled operating systems and “productivity software” – the stuff that you can’t get a refund for – are completely at the discretion of the organisations who make it. (It also doesn’t help these companies that they seem to be unable to deliver a quality product with a stable set of internal components, or that they introduce stupid hardware features that make their products excruciating to use.)

Everybody Hurts

For the most part, it probably is the case that if you are well-resourced and well-connected, you can buy the most expensive computer with the most expensive proprietary software for it, and maybe the likes of Lenovo won’t have tainted it with their adware-of-the-month. But naturally, proprietary software doesn’t provide you with any inherent assurances that it hasn’t been compromised: only Free Software can offer you that, and even then you must be able to insist on the right to be able to build and install that software on the hardware yourself. Coincidentally, I did once procure a Lenovo computer from a retailer that only supplied them with GNU/Linux preinstalled, with Lenovo being a common choice amongst such retailers because the distribution channel apparently made it possible for them to resell such products without Windows or other proprietary products ever becoming involved.

But sometimes the rich and well-connected become embroiled in surveillance and spying in situations of their own making. Having seen people become so infatuated with Microsoft Outlook that they seemingly need to have something bearing the name on every device they use, it is perhaps not surprising that members of the European Parliament had apparently installed Microsoft’s mobile application bearing the Outlook brand. Unfortunately for them, Microsoft’s “app” sends sensitive information including their authentication credentials off into the cloud, putting their communications (and the safety of their correspondents, in certain cases) at risk.

Some apologists may indeed claim that Microsoft and their friends and partners collecting everybody’s sensitive details for their own convenience is “not an issue for the average user”, but in fact it is a huge issue: when people become conditioned into thinking that surrendering their privacy, accepting the inconveniences of intrusive advertising, always being in debt to the companies from which they have bought things (even when those purchases have actually kept those companies in business), and giving up control of their own belongings are all “normal” things and that they do not deserve any better, then we all start to lose control over the ways in which we use technology as well as the technologies we are able to use. Notions of ownership and democracy quickly become attacked and eroded.

What Were They Thinking?

We ultimately risk some form of authority, accountable or otherwise, telling us that we no longer deserve to be able to enjoy things like privacy. Their reasons are always scary ones, but in practice it usually has something to do with them not wanting ordinary people doing unexpected or bothersome things that might question or undermine their own very comfortable (and often profitable) position telling everybody else what to do, what to worry about, what to buy, and so on. And it turns out that a piece of malware that just has to see everything in its rampant quest to monetise every last communication of the unwitting user now gives us a chance to think properly about how we really want our computers and their suppliers to behave.

So, what were they thinking at Lenovo? That Superfish was an easy way to make a few extra bucks? That their customers don’t deserve anything better than to have their private communications infused with advertising? That their customers don’t need to know that people are tampering with their Internet connection? That the private information of their customers was theirs to sell to anyone offering them some money? Did nobody consider the implications of any of this at all, or was there a complete breakdown in ethics amongst those responsible? Was it negligence or contempt for their own customers that facilitated this pursuit of greed?

Sadly, the evidence from past privacy scandals involving major companies indicates that regulatory or criminal proceedings are unlikely, merely fuelling suspicions that supposed corporate incompetence – the existence of conveniently unlocked backdoors – actually serves various authorities rather nicely. It is therefore up to us to remain vigilant and, of course, to exercise our own forms of reward for those who act in our interests, along with punishment for those whose behaviour is unacceptable in a fair and democratic society.

Maybe after going without it for a while, Lenovo will find that our business and our money matter more to them than that of some shady “advertising” outfit with the dubious and slightly unbelievable objective of showing more adverts to people while they do their online banking. And by then, maybe Lenovo (and everyone else) will let us install whatever software we like on their products, because many people aren’t going to be trusting the bundled software for a long time to come after this. Not that they should ever have trusted it in the first place, of course.

Making the Best of a Bad Deal

Wednesday, January 7th, 2015

I had the opportunity over the holidays to browse the January 2015 issue of “Which?” – the magazine of the Consumers’ Association in Britain – which, amongst other things, covered the topic of “technology ecosystems”. Which? has a somewhat patchy record when technology matters are taken into consideration: on the one hand, reviews consider the practical and often mundane aspects of gadgets such as battery life, screen brightness, and so on, continuing their tradition of giving all sorts of items a once-over; on the other hand, issues such as platform choice and interoperability are typically neglected.

Which? is very much pitched at the “empowered consumer” – someone who is looking for a “good deal” and reassurances about an impending purchase – and so the overriding attitude is one that is often in evidence in consumer societies like Britain: what’s in it for me? In other words, what goodies will the sellers give me to persuade me to choose them over their competitors? (And aren’t I lucky that these nice companies are throwing offers at me, trying to win my custom?) A treatment of ecosystems should therefore be somewhat interesting reading because through the mere use of the term “ecosystem” it acknowledges that alongside the usual incentives and benefits that the readership is so keen to hear about, there are choices and commitments to be made, with potentially negative consequences if one settles in the wrong ecosystem. (Especially if others are hell-bent on destroying competing ecosystems in a “war” as former Nokia CEO Stephen Elop – now back at Microsoft, having engineered the sale of a large chunk of Nokia to, of course, Microsoft – famously threatened in another poor choice of imagery as part of what must be one of the most insensitively-formulated corporate messages of recent years.)

Perhaps due to the formula behind such articles in Which? and similar arenas, some space is used to describe the benefits of committing to an ecosystem. Above the “expert view” describing the hassles of switching from a Windows phone to an Android one, the title tells us that “convenience counts for a lot”. But the article does cover problems with the availability of applications and services depending on the platform chosen, and even the matter of having to repeatedly buy access to content comes up, albeit with a disappointing lack of indignation for a topic that surely challenges basic consumer rights. The conclusion is that consumers should try to keep their options open when choosing which services to use. Sensible and uncontroversial enough, really.

The Consequences of Apathy

But sadly, Which? is once again caught in a position of reacting to technology industry change and the resulting symptoms of a deeper malaise. When reviewing computers over the years, the magazine (and presumably its sister publications) always treated the matter of platform choice with a focus on “PCs and Macs” exclusively, with the latter apparently being “the alternative” (presumably in a feeble attempt to demonstrate a coverage of choice that happens to exist in only two flavours). The editors would most likely protest that they can only cover the options most widely available to buy in a big-name store and that any lack of availability of a particular solution – say, GNU/Linux or one of the freely available BSDs – is the consequence of a lack of consumer interest, and thus their readership would also be uninterested.

Such an unwillingness to entertain genuine alternatives, and to act in the interests of members of their audience who might be best served by those solutions, demonstrates that Which? is less of a leader in consumer matters than its writers might have us believe. Refusing to acknowledge that Which? can and does drive demand for alternatives, only to then whine about the bundled products of supposed consumer interest, demonstrates a form of self-imposed impotence when faced with the coercion of proprietary product upgrade schedules. Not everyone – even amongst the Which? readership – welcomes the impending vulnerability of their computing environment as another excuse to go shopping for shiny new toys, nor should they be thankful that Which? has done little or nothing to prevent the situation from occurring in the first place.

Thus, Which? has done as much as the rest of the mainstream technology press to unquestioningly sustain the monopolistic practices of the anticompetitive corporate repeat offender, Microsoft, with only a cursory acknowledgement of other platforms in recent years, qualified by remarks that Free Software alternatives such as GNU/Linux and LibreOffice are difficult to get started with or not what people are used to. After years of ignoring such products and keeping them marginalised, this would be the equivalent of denying someone the chance to work and then criticising them for not having a long list of previous employers to vouch for them on their CV. Noting that 1.1 billion people supposedly use Microsoft Office (“one in seven people on the planet”) makes for a nice statistic in the sidebar of the print version of the article, but how many people have a choice in doing so or, for that matter, in using other Microsoft products bundled with computers (or foisted on office workers or students due to restrictive or corrupt workplace or institutional policies)? Which? has never been concerned with such topics, or indeed the central matter of anticompetitive software bundling, or its role in the continuation of such practices in the marketplace: strange indeed for a consumer advocacy publication.

At last, obliged to review a selection of fundamentally different ecosystem choices – as opposed to pretending that different vendor badges on anonymous laptops provide genuine choice – Which? has had to confront the practical problems brought about by an absence of interoperability: that consumers might end up stranded with a large, non-transferable investment in something they no longer wish to be a part of. Now, the involvement of a more diverse selection of powerful corporate interests has made such matters impossible to ignore. One gets the impression that for now at least, the publication cannot wish such things away and return to the lazy days of recommending that everyone line up and pay a single corporation their dues, refusing to believe that there could be anything else out there, let alone usable Free Software platforms.

Beyond Treating the Symptoms

Elsewhere in the January issue, the latest e-mail scam is revealed. Of course, a campaign to widen the adoption of digitally-signed mail would be a start, but that is probably too much to expect from Which?: just as space is dedicated to mobile security “apps” in this issue, countless assortments of antivirus programs have been peddled and reviewed in the past, but solving problems effectively requires a leader rather than a follower. Which? may do the tedious job of testing kettles, toasters, washing-up liquids, and much more to a level of thoroughness that would exhaust most people’s patience and interest. And to the publication’s credit, a certain degree of sensible advice is offered on topics such as online safety, albeit with the usual emphasis on proprietary software for the copy of Windows that members of its readership were all forced to accept. But in technology, Which? appears to be a mere follower, suggesting workarounds rather than working to build a fair market for safe, secure and interoperable products.
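For what it’s worth, even checking whether a message claims to be digitally signed is straightforward: both OpenPGP/MIME (RFC 3156) and detached S/MIME signatures use a multipart/signed container, while S/MIME can alternatively carry an enveloped signature as application/pkcs7-mime. A minimal structural check using Python’s standard library might look like the sketch below; note that it only detects the presence of a signature, while actually verifying one requires a cryptographic toolkit such as GnuPG.

```python
import email

# MIME content types that advertise a signature on the message as a whole.
SIGNED_TYPES = {
    "multipart/signed",        # OpenPGP/MIME and detached S/MIME
    "application/pkcs7-mime",  # S/MIME with an enveloped signature
}


def claims_signature(raw_message: str) -> bool:
    """Return True if the message's MIME structure advertises a signature.

    This says nothing about whether the signature is actually valid.
    """
    msg = email.message_from_string(raw_message)
    return msg.get_content_type() in SIGNED_TYPES
```

A mail client performing even this much could at least flag the unsigned “bank” mails that scams depend on, which is the kind of structural help a campaign for signed mail would aim to make routine.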

It is surely time for Which? to join the dots and to join other organisations in campaigning for fundamental change in the way technology is delivered by vendors and used throughout society. Then, by upholding a fair marketplace, interoperability, access to digital signature and encryption technologies, true ownership of devices and of purchased content, and all the things already familiar to Free Software and online rights advocates, they might be doing their readership a favour after all.

When the Truth Comes Out

Monday, April 28th, 2014

A few months ago I wrote about the decision by the management of the University of Oslo to adopt Microsoft Exchange as its groupware solution. In that text I referred to comments about Roundcube and why that solution could not be used at UiO in the future. Now that the university’s IT department has published a news item mentioning SquirrelMail, I have the opportunity to dig up and share some details from my conversations with the spokesperson for the project that evaluated Exchange and other solutions.

Looking around on the Web, it does not seem unusual for organisations to have introduced Roundcube alongside SquirrelMail because of accessibility concerns (“universell utforming”, or universal design, abbreviated UU in Norway). But in conversations with the project I was told that various shortcomings in accessibility support were a relevant reason why Roundcube could not become part of a future webmail solution at UiO. Now the truth comes out:

“When Roundcube was introduced as UiO’s primary webmail service, SquirrelMail was allowed to live on in parallel because Roundcube had some shortcomings related to accessibility. By the time these had been remedied, the management had decided that e-mail and calendaring should be consolidated in a new system.”

Two things capture our attention here:

  1. That Roundcube had shortcomings “when it was introduced”, even though Roundcube had been in use for some years before the dubious process of evaluating e-mail and calendar solutions was set in motion.
  2. That the improvements happened to arrive too late to influence the management’s decision to adopt Exchange.

Last summer, without publicly repeating the project group’s claims about shortcomings in Roundcube, I asked around about whether there were known deficiencies and room for improvement in Roundcube in this area. Was accessibility really something that held Roundcube back? I familiarised myself with accessibility technologies and tried some of them with Roundcube to see whether the situation could be improved through my own efforts. It may be that Roundcube had serious shortcomings back in 2011 when the project group began its work – I choose not to claim anything of the sort – but since no such shortcomings surfaced in the project’s final report in 2012 (where Roundcube was in fact recommended as part of the open candidate), we must conclude that any such concerns were long gone, and that the university’s own webmail service, even though it is adapted to the organisation’s own visual profile (which may have implications for maintenance), was and remains accessible to all users.

If we dare to believe that the excerpt above tells the truth, we must now conclude that the management’s decision was made long before the very process meant to justify it had concluded. And here we must view the words “the management had decided” in a different light from the usual one – where the management first draws on expertise within the organisation and then makes an informed decision – and instead assume that, as I wrote previously, somebody “had to have” something they liked, made the decision they were always going to make, and then got others to come up with excuses and a rationale that would seem plausible enough to outsiders.

It is one thing that an adequate and functioning IT solution that also happens to be Free Software gets smeared, while apparently unpopular proprietary solutions such as Fronter escape such scrutiny, so that problems reported in 2004 were being discussed again in 2012 without the vendor doing much more than promising to “work on accessibility” going forward. It is another thing that sustainable investment in Free Software seems so alien to decision-makers at UiO that people would rather turn the working day of ordinary employees upside down – as the replacement of the e-mail infrastructure has done for some – than investigate the possibility of undertaking relatively small development tasks to upgrade existing systems, sparing people from having to be “[on duty] day and night” during “the last few months” (and here, of course, we are not referring to people in the management).

That employees in the IT department were gagged about the process and had to insist that parts of the basis for the decision be withheld from the public. That rank-and-file IT staff must run back and forth to make sure that nobody becomes too dissatisfied with the outcome of a decision that neither they nor other ordinary employees had any real influence over. That others must adapt to the preferences of people who really should have had no say in how others carry out their daily work. All of this says something about the management culture and the state of democracy inside the University of Oslo.

But slowly, the truth comes out.

Mobile Tethering, Privacy and Predatory Practices

Saturday, April 26th, 2014

Daniel Pocock provides some fairly solid analogies regarding arbitrary restrictions on mobile network usage, but his blog system seems to reject my comment, so here it is, mostly responding to Adam Skutt’s remark on “gas-guzzlers” (cars that consume more petrol/gasoline than they really need).

The second example or analogy mentions “exotic” cars, not necessarily gas-guzzling ones. The point being highlighted is that when producers can ascertain or merely speculate that customers can afford higher prices, they may decide to exploit those customers; things like “tourist prices” are another example of such predatory practices.

I think the first example is a fairly solid rebuttal of the claim that this is all about likely bandwidth consumption (and that tethered devices would demand more than mobile devices). Just as the details of the occupants of a vehicle should be of no concern to a petrol station owner, so should the details of network-using programs be of no concern to a mobile network operator.

Operators are relying on the assumption that phones are so restricted that their network usage will be constrained accordingly, but this won’t be the case forever and may not even be the case now. They should stop pining for the days when phones were totally under their own control, with every little feature being a paid upgrade to unlock capabilities that the device had from the moment it left the factory.
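As an aside, one widely reported heuristic that operators use to guess at tethering illustrates how fragile such policing is: packets generated on the handset leave with a default initial TTL (commonly 64 on Linux/Android, 128 on Windows), whereas packets forwarded from a tethered computer have been decremented once by the phone’s NAT before they reach the network. A toy classifier along those lines is sketched below; this is the reported idea only, not any operator’s actual implementation.

```python
# Default initial TTL values commonly used by popular operating systems.
DEFAULT_TTLS = {64, 128, 255}


def looks_tethered(observed_ttl: int) -> bool:
    """Guess whether a packet was forwarded through a phone's NAT.

    A handset-originated packet arrives at the operator's first hop with a
    default TTL; a forwarded packet arrives one below a default, because
    the phone decremented it while routing.  The check is trivially
    defeated by raising the TTL on the tethered machine, which is
    precisely the point: the operator is inspecting a byte the sender
    controls.
    """
    return observed_ttl not in DEFAULT_TTLS and (observed_ttl + 1) in DEFAULT_TTLS
```

Because the signal is a single sender-controlled byte, the heuristic tells the operator nothing reliable about the programs using the connection, reinforcing the argument that such details should be none of their concern in the first place.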

I know someone whose carrier-locked phone wouldn’t share pictures over Bluetooth whereas unlocked phones of the same type would happily do so. Smartphones are said to be a computer in one’s pocket: this means that we should also fight to uphold the same general purpose computing rights that, throughout the years, various organisations have sought to deny us all from our desktop and laptop computers.