Common Threads of Minicomputing History
Friday, January 9th, 2026

In the past few years, in my exploration of computing history, the case of the Norwegian computing manufacturer Norsk Data has been a particular fascination. Growing up in 1980s Britain, it is entirely possible that the name appeared in newspapers and magazines I saw or read, although I cannot remember any particular occurrences. It is also easy to mix it up with Nokia Data: a company that was eventually acquired by the United Kingdom’s own computing behemoth, ICL.
Looking back, however, it turns out that Norsk Data even managed to get systems into institutions not too many miles from where I grew up, and the company did have a firm commercial presence in the UK, finding niches in various industries and forms of endeavour. Having now lived in Norway for a considerable amount of time, it is perhaps more surprising that Norsk Data is almost as forgotten, and leaves almost as few traces, in its home country.
When I arrived in Norway, I gave no thought whatsoever to Norsk Data, even though I had been working at an organisation that had been one of the company’s most prominent customers and the foundation for its explosive growth during the 1970s and 1980s. But my own path through the Norwegian computing sector may well have crossed those of the company’s many previous employees, and in fact, one former employer of mine was part of a larger group that had acquired parts of the disintegrating Norsk Data empire.
It might come as a surprise that a company with over 4000 employees at its peak, many of them presumably in Oslo, and with annual revenues of almost 3 billion Norwegian crowns (around $450 million), would crumble within years and leave so little behind to show for itself. Admittedly, some of the locations of the company’s facilities have been completely redeveloped in recent years. But one might have expected an enduring cultural or social legacy.
In looking back, we might observe a phenomenon that shares certain elements with events in other countries and at other companies, and also make some more general points about technological aspiration, contrasting the ambitions of that earlier era with today’s “innovation” culture, where companies arguably have much more mundane goals.
Big Claims by Small People
One of my motivations for looking into the history of Norsk Data arose from studying some of the rhetoric about its achievements and its influences on mainstream technology and wider society, these intersecting with CERN and the World Wide Web. There are some who have dared to claim that the Web was practically invented on Norsk Data systems, and with that, imaginations run riot and other bold claims are made. I personally strongly dislike such behaviour.
When Apple devotees, for example, insist that Apple invented a range of technologies, the obligation is then put on others to correct the ignorant statements concerned and to act to prevent the historical record from being corrupted. So, no, Apple did not “invent” overlapping windows. And when corrected, one finds oneself obliged to chase down all the usual caveats and qualifications in response that are so often condensed into “but really they did”. So, no, Apple were not the first to demonstrate systems where the background windows remained “live” and updated, either.
Why can’t people be satisfied with the achievements that were made by their favourite companies? Is it not enough to respect the work actually done, instead of extrapolating and maximising a claim that then extends to a claim of “invention” and thus dominance? Such behaviour is not only disrespectful to the others who also did such work and made such discoveries, potentially at an earlier time, but it is disrespectful to the collaborative environment of the era, many of whose participants would not have seen themselves as adversaries. It is even disrespectful to the idols of the devotees making their exaggerated claims.
And if people revisited history, instead of being so intent on rewriting it, they might learn that such claims were litigated – literally – in decades past. Attempts to exclude other companies from delivering common technologies left Apple with little more than a trashcan. Maybe the company’s lawyers had wished that the perverse gesture of dragging a disk icon to a dustbin icon to eject a floppy disk might, for once, have just erased the company’s opportunistic, wasteful and flimsy lawsuit.
Questions of Heritage
What intrigued me most were some of the claims by Norsk Data itself. The company started out in the late 1960s, introducing the Nord-1, a 16-bit minicomputer, for industrial control applications. Numerous claims of “firsts” are made for that model in the context of minicomputing (virtual memory, built-in floating-point arithmetic support), perhaps contentious and subject to verification themselves, but it was the introduction of its successor where such claims start to tread on more delicate territory.
The Nord-5, introduced in 1972, has occasionally been claimed as the first 32-bit minicomputer. In fact, it could only operate in conjunction with a Nord-1, with the combination potentially being regarded as a minicomputing system. At the time, and for the handful of customers involved, this combination was described as the NORDIC system: a name that was apparently not used much, if ever, again. In practice, this was one or more 16-bit minicomputers with an attached 32-bit arithmetic processor.
Such clarifications might seem pedantic, but people do have strong opinions on such matters. Whereas Digital Equipment Corporation’s VAX, introduced in 1977, might be regarded as an influential machine in the proliferation of 32-bit minicomputing, occasionally and incorrectly cited as the first system of its kind, it is generally conceded that the Interdata 7/32 and 8/32, introduced in 1973, have a more substantial claim on any such title. Certainly, these may well have been the first 32-bit minicomputers priced at $10,000 or below. Meanwhile, the NORDIC system cost over $600,000 for the Norwegian Meteorological Institute to acquire.
One might argue that NORDIC was not a typical minicomputing system, nor priced accordingly. It does, however, prompt an observation: attaching a component with certain superior characteristics to an existing component, however much the addition complements the existing component’s capabilities, does not necessarily yield the equivalent of a coherent system built with such superior characteristics throughout. We may return to this topic later, not least because certain phenomena have a habit of recurring in the computing industry.
As much as one might say in categorising the Nord-5, it was an interesting machine. Thanks to those who took an interest in archiving Norsk Data’s heritage, we are able to look at descriptions of the machine’s architecture, its instruction set, and so on. For those who have encountered systems from an earlier time and found them constraining and austere, the Nord-5 is surprising in a few ways. Most prominently, it has 64 general-purpose registers of 32 bits in size, pairs of which may be grouped to form 64-bit floating-point registers where required.
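For readers more comfortable with code than with block diagrams, a small sketch may help to illustrate what register pairing implies. The following Python fragment is purely illustrative: it uses the modern IEEE 754 double format, which postdates the Nord-5 and differs from the machine’s actual floating-point representation, simply because Python can decode it conveniently.

    import struct

    def pair_to_double(high_word, low_word):
        # Combine two 32-bit "register" values into one 64-bit pattern and
        # interpret it as a floating-point number. The choice of which word
        # holds the high bits is an assumption made for the example.
        combined = (high_word << 32) | low_word
        return struct.unpack(">d", combined.to_bytes(8, "big"))[0]

    # 0x400921FB54442D18 is the IEEE 754 encoding of pi, split across two words.
    print(pair_to_double(0x400921FB, 0x54442D18))  # -> 3.141592653589793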
The Nord-5 has only a small number of instruction formats, although some of them seem rather haphazardly organised. It turns out that the machine’s implementation, based heavily on discrete logic integrated circuits and the SN74181 arithmetic logic unit in particular, dictates this organisation. One might have thought that the limitations of the technology would have restrained the designers, making them focus on a limited feature set so as to minimise chip count and system cost, but exotic functionality exists that is difficult to satisfactorily explain or rationalise at first glance.
For instance, indirect addressing, familiar from various processor architectures, tends to involve an instruction accessing a particular memory location (or pair of locations), reading the contents of that location (or those locations), and then treating this value (or those values) as a memory address. Normally, one would then operate on the contents of this final address. However, in the Nord-5 architecture, such indirection can be done over and over again, so that instead of just one value being loaded and interpreted as an address, the value found at this address may be interpreted as an address, and its value may be interpreted as an address. And so on, for a maximum of sixteen levels, all traversed upon executing a single instruction over a number of clock cycles!
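To make the mechanism more concrete, here is a minimal sketch in Python of how such chained indirection might be resolved. It is based only on the description above: the idea that each fetched word carries a flag indicating whether its value is yet another address to follow is an assumption for illustration, not a statement about the Nord-5’s actual word format.

    MAX_LEVELS = 16  # the Nord-5 reportedly stops after sixteen levels

    def resolve_indirect(memory, address):
        # `memory` maps addresses to (indirect_flag, value) pairs: if the flag
        # is set, the value is treated as yet another address to follow.
        for _ in range(MAX_LEVELS):
            indirect, value = memory[address]
            if not indirect:
                return address  # this word holds the operand itself
            address = value     # follow the chain one more level
        raise RuntimeError("indirection chain exceeded %d levels" % MAX_LEVELS)

    # Address 0x10 points to 0x20, which points to 0x30, which holds the operand.
    memory = {0x10: (True, 0x20), 0x20: (True, 0x30), 0x30: (False, 42)}
    print(hex(resolve_indirect(memory, 0x10)))  # -> 0x30

The notable point is that all of this pointer chasing happens within one instruction’s execution, costing additional memory cycles rather than additional instructions.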
I must admit that I am not particularly familiar with mainframe and minicomputer architectures, but certain characteristics do seem similar to other machines. For example, the PDP-10 or DECsystem-10, a 36-bit mainframe from Digital introduced in 1966, has sixteen general-purpose registers and only two instruction formats. It also has floating-point arithmetic support using pairs of registers. Later, Digital would discontinue this line of computers in favour of its increasingly popular and profitable VAX range of computers: a development that would parallel Norsk Data’s own technological strategy in some ways.
The Nord-5 and its successor, the largely similar Nord-50, were regarded as commercially unsuccessful, although one might argue that the former gave the company access to funding at a crucial point in its history. They also delivered respectable floating-point arithmetic performance, prompting consideration of making them available for minicomputers from other manufacturers. The Nord-50 was described in some reporting as “a cheaper smaller scale version of the CDC Cyber 76 or Cray-1”; even if we ignore such hype, one can consider how pursuing this floating-point accelerator business might have influenced the eventual fate of the company.
Role Models and Rivals
In potentially lucrative sales environments like CERN, where Norsk Data gained a valuable foothold during the 1970s, the company would have seen a lot of business going the way of companies like Digital, IBM and Hewlett-Packard. Such companies would have been almost like role models, indicating areas in which Norsk Data might operate, and providing recipes for winning business and keeping it.
Indeed, when discussing Norsk Data, it is almost impossible to avoid bringing Digital Equipment Corporation into the discussion, not least because Norsk Data constantly compared itself with Digital, favourably compared its products with Digital’s, and quite clearly aspired to be like its more established rival, to the point of seemingly trying to emulate it. However, it might be said that this approach rather depended on what Digital’s own strategy was perceived to be, and whether the people at Norsk Data actually understood Digital’s business and products.
Much has been written about Digital’s own fall from grace, being a company with sought-after products that helped define an industry, only to approach the end of the 1980s in a state of near crisis, with its products being outperformed by the insurgent Unix system vendors and with its own customers wanting a more convincing Unix story from their supplier. In certain respects, Norsk Data’s fortunes followed a similar path, and we might then be left wondering if in trying to be like Digital, the company inadvertently copied its larger rival’s flaws and replicated its mistakes.
One apparent perception of Digital was that of a complete provider of technology, and Digital was indeed the kind of supplier who would gladly provide everything from hardware and the operating system, through compilers, tools and productivity applications, all the way to removal services for computing facilities. Certainly, computing investments at the minicomputing and mainframe level were considerable, and having a capable vendor was essential.
It was apparently often remarked that “nobody ever got fired for buying IBM”, but it could also be said that buying IBM meant that a whole category of worries could be placed on the supplier. Indeed, Digital was perceived as offering only potential solutions through its technology, as opposed to the kind of complete, working solutions that IBM would be selling. Nevertheless, opportunities were identified in various areas where the bulk of such a solution was ready to deploy. Digital sought to enter the office automation market with its ALL-IN-1 software, competing with IBM’s established products. Naturally, Norsk Data wanted a piece of this action, too.
The business model was not the only respect in which Norsk Data seemed obsessed with Digital. Company financial reports highlighted its superior growth figures in comparison to Digital and other computer companies. The introduction of the VAX in 1977 demanded a response, and the company set to work on a genuine 32-bit “superminicomputer” as a result. This effort dragged on, however, with the ND-500 series only eventually delivered in 1981.
The ND-500 introduced a new architecture incompatible with that of the Nord-5 and Nord-50, trading the large, general-purpose register set for a smaller set of registers partitioned into specialised groups acting as accumulators, index registers, extension registers, base registers, stack and frame registers, and so on. Although it resembled an extended form of Norsk Data’s 16-bit architecture, no effort had been made to introduce instruction set compatibility between the ND-500 and that existing architecture.
The instruction set itself aimed for the orthogonality for which the VAX had become famous, implemented using microcode and supported by a variable-length instruction encoding. Instructions, consisting of instruction code and operand specifier units, could be a single byte or “several thousand bytes” in length. A variety of addressing modes and data types were supported in the large array of instructions and their variants.
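As a rough illustration of what such a variable-length encoding implies for instruction decoding, consider the following Python sketch. The opcode table, addressing modes and byte layout are all invented for the example and do not reflect the actual ND-500 (or VAX) encoding; the point is simply that the length of an instruction only becomes known as its operand specifiers are examined.

    # Hypothetical opcode table: opcode byte -> (mnemonic, number of operand specifiers)
    OPCODES = {0x01: ("MOVE", 2), 0x02: ("ADD", 2), 0x03: ("JUMP", 1)}

    def decode(stream, pos=0):
        # Decode one instruction starting at `pos`, returning
        # (mnemonic, operands, position of the next instruction).
        mnemonic, operand_count = OPCODES[stream[pos]]
        pos += 1
        operands = []
        for _ in range(operand_count):
            mode = stream[pos]  # a one-byte addressing-mode specifier (assumed)
            pos += 1
            if mode == 0x00:    # register operand: one extra byte
                operands.append(("reg", stream[pos]))
                pos += 1
            elif mode == 0x01:  # 32-bit immediate operand: four extra bytes
                operands.append(("imm", int.from_bytes(stream[pos:pos + 4], "little")))
                pos += 4
            else:
                raise ValueError("unknown addressing mode %#x" % mode)
        return mnemonic, operands, pos

    # "ADD register 5, immediate 1000": eight bytes in this invented encoding,
    # but a different choice of operands would give a different length.
    code = bytes([0x02, 0x00, 0x05, 0x01]) + (1000).to_bytes(4, "little")
    print(decode(code))  # -> ('ADD', [('reg', 5), ('imm', 1000)], 8)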
And yet each ND-500 “processor” in any given configuration was still coupled with a 16-bit ND-100 “front-end”, this being an updated Nord-10 used for input/output and running much of the operating system, thus perpetuating the architectural relationship between the 16- and 32-bit components previously seen in the NORDIC system from several years earlier. In effect, the ND-500 still favoured computational workloads, and without the front-end unit, it could not be considered a minicomputing system in its own right.
Going Vertical
One distinct difference between the apparent strategy of Norsk Data and that of Digital, perhaps based on misconceptions of Digital’s approach or maybe founded on simple opportunism, was the way that Norsk Data sought to be a complete, “vertically integrated” supplier in various specialised markets, whereas Digital could more accurately be described as a platform company. In one commentary I discovered while browsing, I found these pertinent remarks:
“In the old “Vertical” business model a major supplier would develop everything in house from basic silicon chips right through to … financial applications software packages. This model was clearly absurd. A company may be good at developing or providing several of the technologies and services in the value chain but it is inconceivable that any single company could be the best at doing everything.”
They originate from a representative of ICL, describing that company’s adoption of open standards and Unix as “a strategic platform”. Companies like ICL had their origins in earlier times when computer companies were almost expected to do everything for a customer, in part due to a lack of interoperability between systems, in part due to a traditional coupling of hardware and software, and in part due to a lack of expertise in information systems in the broader economy, making it a requirement for those companies to apply their proprietary technologies to the customer’s situation and to tackle each case as it came along.
Gradually, software became seen as an independent technology and product in its own right, interoperability materialised, and opportunities emerged for autonomous “third parties” in the industry. The “horizontal” model, where customers could choose and combine the technologies that were most appropriate for them, was resisted by various established companies, but in a dynamic market, they were eventually made to confront their own limitations.
In historical reviews of Norsk Data and its business, such as Tor Olav Steine’s “Fenomenet Norsk Data” (“The Norsk Data Phenomenon”), there is surprisingly little use of the word “solution” in the sense of an information technology system taken into practical use, even though the word is used a great deal in certain areas of the computer business. In those areas, like consultancy, the nature of the business may revolve entirely around the provision and augmentation of existing products to provide something a customer can use: what we call a “solution”. Such businesses simply could not exist without software and hardware platforms on which to deliver such solutions.
Where “solution” is used in such a way in Steine’s account, it is in the context of a company like Norsk Data choosing not to sell “solutions into specific markets” like the banking sector, identifying this as a critical weakness of the company’s strategy. Certainly, a company like Norsk Data had to be adaptable and to accommodate initiatives to supply such sectors, but the mindset exhibited is that the company had to back up the salesforce with a “massive effort” to solve all of the customer’s problems. This was precisely the kind of “vertical” supplier that ICL and IBM had been out of historical necessity, entrusted with such endeavours, but also burdened by society’s continuing expectations of such companies.
Indeed, it says a great deal that IBM was the principal competition in the sector used to illustrate this alleged weakness of Norsk Data. IBM’s own crisis arrived in the early 1990s with a then-record financial loss and waves of reorganisations, somewhat decoupling the product divisions of the company from its growing services and solutions divisions, also gradually causing the company to adopt open systems and technologies. Its British counterpart in such traditional sectors, ICL, dabbled in open systems and Unix, largely keeping them away from its mainframe business, but pivoted strongly at the start of the 1990s, perhaps influenced by Fujitsu – its partner, investor and, eventually, owner – to adopt the SPARC architecture and System V Unix.
Norsk Data, however, stuck with vertical integration, to its initial benefit and then its later detriment. The company had done some good business on the back of acquisitions in certain sectors – typesetting systems, computer-aided design/manufacturing – where opportunities were identified to migrate existing products to Norsk Data’s hardware and ostensibly to boost the performance that may have been lacking in the existing offerings, but the company found itself struggling to repeat such successes. In markets like the UK, it encountered indifference from software companies, who apparently perceived the company to be “too small”, and so it tried to invest in and cultivate smaller companies as vehicles for its technology.
Here, there may have been a possible lack of awareness or acceptance that instead of being “too small”, Norsk Data was perhaps too niche or too non-standard, in an era of emerging standards. After all, such standards increasingly defined software and hardware platforms on which other companies would build. The fixation on vertical market opportunities, having something that competitors did not, and “striking a knockout” in competitive situations, seems rather incompatible with cultivating an ecosystem around one’s products.
Another trait is apparent from discussion of the company: the tendency to sell one set of products to a customer in order to then try to sell that customer another set of products. Thus, a customer buying one of the vertical market products might be coaxed into adopting various other strategic Norsk Data products, like the celebrated NOTIS suite of productivity applications. And in relying on niche products to create opportunities for further sales and for the proliferation of the company’s core technologies, Norsk Data got itself into trouble.
Personal Computing the Hard Way
Despite the increasing prominence of personal computing in the late 1970s and early 1980s, Norsk Data had remained largely dismissive of the trend, as many traditional vendors had also been initially. Minicomputer vendors sold multi-user machines that ran applications for each of the users, communicating output, usually character-based, to simple display terminals whose users would respond with keystrokes that would be communicated back to the “host”, thus providing an interactive computing environment. With shared storage, applications could provide a degree of collaboration absent from the average, standalone microcomputer. What exactly could a standalone microcomputer do that a terminal attached to a minicomputer could not?
Alongside this, applications that would become familiar to microcomputer users had emerged in minicomputer environments. For instance, word processing systems had demonstrated productivity benefits to organisations, providing far more flexibility and efficiency than typewriters and secretarial pools (also known as typing pools, so no, nothing to do with splashing around). Minicomputer environments could provide shared resources for still-expensive devices like printers, particularly high-end ones, and the shared storage permitted a library of materials to be accessed and curated.
From such easy triumphs in computerisation, much was anticipated from the nebulous practice of office automation. But perhaps because of the fragmented needs and demands of organisations, all-conquering big office systems could not hold off the gradual erosion of minicomputing dominance by the intrusion of microcomputers. One might argue that a personal computer, introduced at a relatable, personal level as a simple product or commodity, along with similarly packaged software, may have been more obviously adaptable to some kinds of organisations, particularly small ones without strong expectations of what computer systems should do and how they might behave.
Indeed, where traditional suppliers of computers were perceived by newcomers as unapproachable or intimidating, microcomputers offered a potentially gentler introduction, as amusingly noted in one Norwegian article featuring IBM, Digital, HP and Norsk Data. For example, correspondence, documentation and other written records may have been cumbersome to prepare even with electronic typewriters, these providing only crude editing functions. Depending on the levels of enthusiasm for alternatives and frustration with the current situation, it would have been natural to acquire a personal computer with accessories, and to try out word processing and other applications to see what worked best for any given person, office, department, or organisation.
(A mixture of personal computing systems might have eventually generated interoperability problems, amongst others, but the agility that personal computers afforded organisations would potentially inform larger and more ambitious attempts to introduce technology later on.)
Personal computers began to shape user expectations of what computers of all kinds could do. Indeed, it is revealing that in treatments of the office automation market from the early 1980s, microcomputers keep recurring, and personal workstations – particularly the Xerox Star – set the tone for whatever office automation was meant to be. This was undoubtedly due to the unavoidable focus on the user interface that microcomputing and personal computing demanded. After all, personal computing cannot really be personal without considering the user!
Crucially, however, Xerox appeared to understand that one product could not be right for everyone, thus pitching a range of systems for a variety of users. The Xerox 860 focused largely on traditional word processing applications. The Xerox 8010 (or Star) was a networked workstation for sophisticated users. The company realised, particularly with IBM poised to move into personal computing, that a need existed for a more affordable product, leading to the much cheaper Xerox 820 running the established CP/M operating system. Although the Xerox 820 appears to have been considered a disappointment by commentators, who were perhaps expecting something more revolutionary, it did appear to signal that Xerox took affordable personal computing seriously, and the company was not alone in formulating such a product.
Digital tried a few different approaches to personal computing, two of which involved applications of their minicomputer architectures: the PDP-8-based DECmate, and the PDP-11-based DEC Professional. But it was their third and perhaps least proprietary approach, the DEC Rainbow, that stood the best chance of success, following a similar path to the Xerox 820, but taking the Zilog Z80-based core of such a machine and extending it with a companion Intel 8088 processor for increased versatility.
Such hybrid systems were not uncommon for a brief period at the start of the 1980s: established CP/M users would need Z80 compatibility, whereas new users and new software would have benefited from the 8088 running CP/M-86 or MS-DOS. The Rainbow was not a success, hampered by Digital’s proprietary instincts. Personally, I found it surprising to learn that the machine had a monochrome display as standard. Even with the RGB colour option, it would have rendered its own logo relatively unsatisfactorily!
What was Norsk Data’s response to the personal computing bandwagon? A telling quote can be found in an article from a 1985 newsletter:
“We do not believe in “the universal workstation” that can solve all problems for all user categories. Alternative hardware and software combinations seem to be the right answer. The functionality requirements for the personal workstations are definitely not satisfied by the “traditional PC”. For the majority of users today, the NOTIS terminal is the best alternative for a personal workstation that is integrated with the rest of the organization.”
The first two sentences seem reasonable enough, and the third could certainly have seemed reasonable at the time. But then comes the absurd, self-serving conclusion: a proprietary character terminal is the “best alternative” to something like the Xerox Star or its successor, the Xerox 6085 “Daybreak”, introduced in 1985, or other actual workstation products arriving on the market.
Evidently, decision-makers at the company remained fixated on what they considered their blockbuster products. But the personal computing trend was not about to disappear. The company’s first attempt at a product, initiated in 1983 and released in 1984, involved a rebadged IBM-PC-compatible from Columbia Computers; it was sold “half-heartedly” and was perhaps more influential within the company than outside it.
Then, in 1986, came the product that only Norsk Data could make: the Butterfly workstation, featuring an Intel 80286 processor and running MS-DOS and contemporary Windows, but also featuring two expansion cards that implemented the ND-110 minicomputer processor to run the proprietary SINTRAN operating system. Naturally, such a workstation, with its built-in minicomputer, was intended to run the cherished NOTIS software, and a variant known as the Teamstation permitted the connection of four terminals to share in such goodness.
One can almost understand the thinking behind such products. There was an increasing clamour for approachable computing, with relatively low starting costs, and with buyers starting out with a single machine and seeing whether they liked the experience. Providing something whose experience for a single user could be expanded to cover another four users might have seemed like a reasonable idea. But to make sense to customers, those extra terminals would need to be inexpensive and offer something that another four personal computers might not, and the software involved would have to be better than the kinds of programs that ran natively on personal computers at the time. Here, the beliefs of those at Norsk Data and those of potential customers could easily have been rather different.

“Norsk Data didn’t buy Wordplex for nothing.” Norsk Data perhaps inadvertently played into 1980s stereotypes when boasting about having loads of money. Wordplex was a struggling word processing systems vendor, and the acquisition did not lead to the happy marriage of convenience – or otherwise – that was promised.
Turning Something into Nothing
Against the industry tide, it seems that the company did what came most naturally, seeking growth for its own applications amongst a captive customer audience. Thus, amidst a refinancing exercise at word processing supplier Wordplex that turned into a takeover opportunity for Apricot Computers, Norsk Data barged in with a more valuable offer, leaving Apricot to withdraw from the contest, presumably with some relief. Wordplex, one of the success stories in an earlier phase of office automation, was struggling financially but had an enviable customer base in a market where Norsk Data had wanted a greater presence than it had previously managed to attain.
What the exact plan was for Wordplex is not entirely clear. The company had its own product roadmap, centred on its Zilog Z8000-based Series 8000 systems, initially running Wordplex’s proprietary operating system. Wordplex evidently acknowledged the emergence of Unix and sought to introduce Xenix for its systems, chosen perhaps for its continued support for the aging Z8000 architecture. Norsk Data’s contribution seems to have been to sell their own machines to “stand beside or stack vertically” on the Series 8000 machines, offering what looked suspiciously like the NOTIS suite. One could easily imagine that Wordplex’s product range was unlikely to receive much further development after that.
Commentators associated with Norsk Data seem to regard Wordplex as something of a misadventure. Steine goes as far as to accuse the Wordplex management of subverting the organisation and pursuing their own agenda, as opposed to getting on with their new duties of selling Norsk Data’s systems to those valuable customers. Yet he does seem to accept that when new systems were pushed onto Wordplex’s customers, their computing departments pushed back against the additional complexity these new, proprietary systems would introduce, although Steine seems to attribute such pushback more to an unwillingness to tolerate new vendors in their computer rooms.
Perhaps Wordplex’s management stuck to what they knew because they just weren’t given better tools for the job. Existing customers would want to see some continuity, even if their users would eventually see themselves migrated onto other technology. In the end, Norsk Data’s perceived opportunities never materialised, and Wordplex customers presumably saw the writing on the wall and migrated to other systems. The dedicated word processing business was being disrupted by low-cost personal computers either dressed up like word processors, as in the case of the Amstrad PCW, or providing more general functionality, maybe even in a networked configuration.
It is telling that in the documentary covering the takeover, a remark is made about how Norsk Data seemed inexperienced at acquisitions and the task of integrating distinct corporate cultures, and yet the company had, in fact, acquired other companies to fuel its rapid growth. But still, it is apparent that entities like Comtec and Technovision remained distinct silos within the larger organisation. I have personal familiarity with one institutional customer of Comtec, although it may have been a faint memory by the time I interacted with its users.
That customer was CERN’s publishing section, responsible for the weekly Bulletin and other output, who had adopted the NORTEXT typesetting system with some success. In 1985, these users were gradually trying to adopt various NOTIS applications, expressing a form of cautious optimism. By 1986, with an audit of CERN’s information systems in progress, these users were facing an upgrade of NORTEXT that required terminals designed for NOTIS, as well as enhancements to NOTIS that, having been developed primarily for the newer, 32-bit ND-500 systems, stressed their older, 16-bit ND-100 series hardware.
More investment was requested to take advantage of newer hardware, to increase storage, and to provide more terminals. Indeed, the introduction of ND-500 models would help to rationalise the hardware situation, reduce maintenance costs and demands, and provide better services to those users. But at the same time, amidst a “lively discussion”, the shortcomings of NOTIS were noted: that Norsk Data were “unlikely to satisfy the needs of the administration in terms of fully automated office functions such as agenda, calendars, conference scheduling”, and that better integration was needed with the growing Macintosh community inside CERN.
Indeed, the influence of the graphical user interface, and the success of the Macintosh in delivering a coherent platform for developers and users, put companies wedded to character-based terminal applications on the back foot. Graphical applications were the natural medium of such platforms, whereas companies like Norsk Data struggled to accommodate such applications within their paradigm, suggesting upgrades to more costly graphical terminals. At best, the result added around $1,000 to the cost of the terminal and merely offered an “experimental” and narrow attempt at being “Macintosh like”, in a world where potential users were more likely to opt for the real thing instead.
Despite the mythology around the Mac, the platform was, like many others, still finding its feet and lacking numerous desirable capabilities. The mid-1980s was a fluid era for the graphical personal computer, and although a similar mythology developed around the Amiga, which was more capable than the Mac in several respects, success for a platform demanded a combination of technology, applications, the convenience of making such applications, and a demand for them.
On the dominant IBM-compatible platform, it took a while for any one graphical layer to assert itself, leaving observers attempting to pick the winner from candidates such as VisiOn, GEM, Windows, OS/2 and NewWave. It is perhaps unsurprising that Norsk Data had no ready answer, and that even as it introduced its own personal computers running DOS and early Windows software, it was merely waiting for an industry consensus to shake out. Other strategies could have been followed, however: other vendors in Norsk Data’s situation chose to enter the workstation market, which is a topic to be considered in its own right.
Another company that struggled with personal computing was ICL. It had acquired some interesting products, such as those made by Singer Business Machines – a division of the Singer Corporation, perhaps most famous for its sewing machines – and that division, then operating as part of ICL, made a system that formed the basis of ICL’s Distributed Resource System product family. In the initial DRS 20 range, computers with 8085 processors running CP/M would run applications and access other machines acting as file servers over ICL’s proprietary Macrolan network.
Such solutions were not always well received by the personal computing media. Expectations that ICL would bring its market position to bear on the rapidly developing industry led to disappointment when the company introduced the first DRS models, drawing suggestions that the diskless “workstations” would make rather competitive personal computers, if only ICL were to remove the “nearly £1000” network card and replace it with a disk controller. Later models would upgrade the processor to the 8086 family and run Concurrent DOS. Low-end models did indeed get disk drives, but did not break out into the standalone personal computer market.
For the standalone market, ICL instead decided to sell a different set of products, licensed from a company called Rair, as its own Personal Computer series, and these even utilised similar technologies to its initial DRS line-up, such as the 8085, CP/M, and MP/M, but offered eight serial ports for connected terminals instead of network connectivity. Rair’s rise to prominence perhaps occurred through the introduction of the Rair Black Box, to which a terminal had to be attached in order to use the system. A repackaged version formed the basis of the first ICL PC.
ICL appear to have been rather more agile than Norsk Data at introducing upgrades to their PC and DRS families. The PC range evolved to include models like the exotic-sounding Quattro, still trying to cater to office environments wanting to serve applications to terminals in a relatively inexpensive way, an approach that was nevertheless seen as less than persuasive in an era where personal computing had now established itself. Eventually, ICL reconciled itself to producing IBM-compatible PCs. In the early 1990s, I encountered some of these in a brief school-era “work shadowing” stay at a municipal computing department which predictably operated an ICL mainframe and some serious Xerox printing hardware.
Meanwhile, the DRS range gained a colour graphical workstation running the GEM desktop software on Concurrent DOS. GEM was a viable product adopted by a variety of companies including Atari, Amstrad and Acorn, despite Apple attempting to assert ownership of various aspects of the desktop paradigm. It would have been interesting to see Apple try and shake ICL down over such claims, given all the prior art in ICL’s PERQ that helped sink Apple’s litigation against Microsoft and Hewlett-Packard later in the decade. But it is how part of the DRS range evolved that perhaps illustrates how the likes of Norsk Data might have acted more decisively.