UNIX History


UNIX is the old war-bird of Operating Systems, created as a private research project by AT&T's Bell Laboratories (Ken Thompson and Dennis Ritchie) in 1969. The "interface" (and I use that term loosely) was "borrowed" liberally from Honeywell's Multics (which stands for MULTiplexed Information and Computing Service). The Bell guys borrowed the name and acronym by creating the UNiplexed Information and Computing Service, which was originally spelled UNICS -- but now we spell it UNIX. Two things can be learned from this:

  1. Uniplexed would imply "one" to the multiplexed's many -- and thus imply that it was a single-user system instead of a multi-user Operating System. Nothing could be further from the truth -- almost every design decision was made to make UNIX a multiplexed (time-sliced), multi-user operating system.
  2. With UNIX the geeks and programmers have always been in charge (with good and bad results). Programmers are not particularly good with (nor concerned about) spelling -- and these guys wanted something shorter, harder to spell and more cryptic than an acronym that made sense. So instead they spelled UNICS with an X -- thus starting a trend of bad spelling and unintuitive abbreviations that is rampant throughout every command in UNIX.

So the take-away from both of these things is, you can't trust anything in UNIX. It's all about arcane knowledge of its history, and not what would really make sense if you were doing it "right" today.

Scope creep

UNIX was meant to be a simple OS (Operating System), designed for computers of that era (a DEC PDP-7, and later a PDP-11). It was modular (reusing code to maximize programmers' efforts), and it evolved to be written in a high-level language of the day (C) to allow portability (running on many platforms) and to be easier to develop and maintain. But what was "high level" back in its day is considered very low level today. While UNIX can be a very stable Operating System once configured, you often have to be a geek/coder to get things (like new Applications) to actually work. Which is part of why it is favored for servers... or why other people pre-package everything you need to actually get it working in "distros" (distributions).

Failing its way to success

UNIX wallowed around and was basically only popular in education and research labs. The commercial Operating Systems of the time (the 70's and early 80's), like MVS for IBM, VMS for DEC, AOS/VS for Data General, and so on, were all more powerful, better tested, easier to use, better documented, had more Applications, and all the other things that made an Operating System superior. But AT&T wasn't in the computer business (early on) -- and so they had the wise marketing plan of giving away the source code to UNIX for free. They retained the trademarked name UNIX for themselves (and you can see how well that has worked out), but the code was free to futz with. Interesting marketing plan -- they wouldn't make any money on the OS, but they'd make it up in volume.

Surprisingly, the plan worked. Since Universities could modify this Operating System freely, many programmers cut their teeth (in school) using it. Researchers were often academics (who are usually slow to change), and so research labs started using UNIX as well. This legacy has totally defined what UNIX is, and what it is good for -- and bad at.

UNIX quickly became a test-bed and demonstrator for research projects in software engineering (grad-work) at many Universities -- and by the late 70's there were soon thousands of different versions of UNIX, each with their own features and hacks, and most going by some annoying acronym like Microsoft's own Xenix, ULTRIX, IRIX and Apple's own A/UX (though some of these particular examples came later). Basically, any Operating System whose name ends in an 'X' is probably a UNIX derivative. But because there were all these branches and projects done on some version of UNIX, if you didn't mind getting dirty (doing some coding/fixing), you could get your UNIX to do just about anything.

All these flavors made things confusing and somewhat incompatible -- they needed to be brought together and standardized (more so).

  • The University of California, Berkeley is really responsible for much of this consolidation effort and the foundation technologies that we know as UNIX today. Their distribution, known as BSD (for "Berkeley Software Distribution"), is still a term recognized today. Berkeley added Virtual Memory, a better file system, TCP/IP networking, and even the freeware model.
  • But many other Universities contributed to UNIX as well, including the Massachusetts Institute of Technology (MIT) and their X Window System.

Again, the key here is that UNIX was a testbed for bolting on technologies, and a foundation for other things. There were also some commercial Unices, but most of the control and direction came from Universities.


UNIX sort of fell into commercial popularity by accident. As Operating Systems became bigger (and had to do more things), they became more and more expensive to make, since there were more and more costs and complexity in making one. Most of it wasn't hard, it was just big. There was so much to building a new OS that all the little computer companies wanted to stop reinventing the wheel. They used UNIX as the foundation of their OS, and then built all the other stuff on top of it, because it was an easy, cheap start... and because most of the people they could hire had UNIX experience somewhere in their background.

A perfect example of this is Sun. Sun was founded by ex-Stanford and Berkeley grad students (including Bill Joy) to meld Stanford hardware (the SUN workstation design; Sun's later SPARC and UltraSPARC processors drew on Berkeley's early RISC research) and Berkeley software (UNIX, which later became SunOS/Solaris), in order to make cutting-edge workstations starting in 1982. Others, like MIPS and Silicon Graphics, followed suit. By borrowing the foundation of their Operating System from UNIX, Sun could focus their development on making solutions for customers -- Applications and Utilities that they needed, built on top of UNIX -- and not spend all their time and R&D budget on making yet another Operating System. It was not that UNIX was a particularly good Operating System -- but it wasn't bad either. UNIX did enough well that it was an excellent foundation... and it was free-ish.

Borrowing UNIX didn't just work in favor of the commercial companies, it also helped UNIX. Sun's commercial motivations helped make UNIX a much more stable Operating System, and created NFS (Network File System) -- and UNIX's popularity meant more Apps could be easily moved to Sun platforms.

The same thing happened for all the other companies that used UNIX as their foundation. Communications and features moved both ways. Big companies or older companies (with legacy and designs) could afford to do some things better than UNIX on their own -- but almost all the smaller and startup computer companies found it easier to just leverage off what was there. So UNIX was used more and more by the smaller companies in the industry -- but all of them together, and all the college work and momentum began to add up.

Another key reason for this cross-platform adoption was that UNIX had been written in a high-level language of its day (C), and used very little assembly language. Assembly Language is very specific to each processor, and so has to be rewritten (or ported) for each new machine (processor) that you run it on. By using C, programmers just had to make a C compiler for their processor, and most of the OS (UNIX) would come over easily. Then all the Applications and Utilities that were written for UNIX could be ported over pretty easily as well. This process wasn't painless, by any stretch of the imagination, but the portability of C and UNIX made getting applications onto your platform a lot easier than it was without them.

Over time, as more and more companies looked at the daunting task of creating their own Operating Systems, and building all the Applications and Utilities that would make one attractive to users -- more and more just went with UNIX instead. It gave them an easy head-start into acceptance, and let them all focus on some core competency and some way to add value to their solution. The more this was done, the more popular it became -- and the more Apps and software were written for UNIX.

🗒️ NOTE:
This whole process is not too dissimilar to why people used DOS, or why Windows was adopted over superior technologies like OS/2 or the Macintosh. If you were a hardware company, then it was much easier to just make something that used a popular OS, and let all the Apps come with it.


In some ways UNIX has not changed at all, and is still the anachronistic monster it was back when computers used to fill a room. In other ways, UNIX has always been on the cutting edge. I blame both of these on the fact that it has been the academic and institutional standard for so long. If you've ever worked with or for a college, you understand this dichotomy -- a big, huge, slow-changing bureaucracy, filled with bureaucrats and some seriously smart people, all playing in their fiefdoms and infighting for money or power, wrapped around either the most useless or completely important research projects. It is a weird mix. This may or may not be better than the commercial (real) world -- but it is a totally different model, with interesting results.

After a while, nobody bothered to innovate at the lower level any more... at least not as fundamentally as they used to. Everything was about what you could retrofit into a UNIX-style architecture, instead of what the right ways to do things were. So now all the big commercial Operating Systems use UNIX. Apple caved and went to UNIX after its own efforts to create a better foundation died out (mostly because they acquired NeXT, and NeXT had used UNIX as its foundation, because it couldn't compete with the likes of Apple/Microsoft/IBM any other way). IBM, Microsoft and circumstance had successfully choked all the other Operating Systems out of business. Then finally, even Microsoft caved and supported UNIX as well. So UNIX won not by superiority but by sloth and determination.


An example I use is called piping, and structured versus unstructured data. In the modern world, you pass messages using structured data -- a file can have a small thumbnail image of the file, the high-resolution version, the text and images that compose it, metadata about when it was created and by whom, even multiple language versions of the same text, and so on. All of this is in a structure -- and when you share that with another App/Service, it looks to see if it has the data in the format it needs, and pulls out those parts. Very useful and modular. In UNIX, you do things like send or "pipe" data from one script (or app) to another. Just a blob of raw shit in one format (unstructured). The other side understands exactly that format, or it doesn't work. Two apps have to know a lot of details about how they are going to communicate. Why does UNIX do it that inferior way? Because in the 60's the idea of data structures was still relatively new, and certainly not standardized, and this was easier to do. So nobody has bothered to fix this -- it is too fundamental to how UNIX-heads expect things to work for anyone to do it better/right.

I'm both a big UNIX fan, and one of its detractors. UNIX is the old war-bird of Operating Systems -- which is ironic, since it really isn't an Operating System any more -- but more on that later. UNIX was created as a private research project by AT&T's Bell Laboratories (Ken Thompson and Dennis Ritchie) in 1969. Since AT&T wasn't really in the computer business, they had the wise marketing plan of giving away the source code to UNIX for free. UNIX wallowed around and was basically only popular in education and research labs, because it was inferior to the other commercial OS's of the time. But since Universities could modify this Operating System freely, many programmers cut their teeth (in school) using it, and researchers came from academia, so they used it too. This legacy has totally defined what UNIX is, and what it is good for -- and bad at. It wasn't good, it was free. But perfect is the enemy of good enough, and UNIX was always "good enough" that people used it, added to it, and it sort of became the de facto solution until it got near-universal adoption. These same pragmatic compromises are similar to why TCP/IP and HTML became the Internet.

Written 1999.10.24 Edited: 2019.08.04