UNIX is a foundation

From iGeek

Is UNIX an OS? Not really, not anymore; it's the foundation of one. When Operating Systems (OS's) were created, they basically consisted of:

  1. A Kernel: (a) messaging (routines so Apps, or parts of them, could talk to each other), (b) scheduling (so that each App got a fair share of time while running), and (c) memory management (so one app couldn't crap on another app, intentionally or by accident/bug).
  2. Device Drivers: little mini-Apps that helped the OS or Applications talk to other parts of the hardware, like the keyboard, screen, network, and so on.
  3. Shell (Command Line): a simple way to issue semi-English commands to tell the OS what Apps to run, or how to manage files.
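To make the shell's role concrete, here is a minimal session of the kind of semi-English commands it accepts -- running little Apps and managing files. The directory and file names are just examples, not anything from the original text:

```shell
# Create a working directory (the -p flag avoids an error if it exists).
mkdir -p /tmp/unix_demo

# Write a file by redirecting the output of the echo App into it.
echo "hello" > /tmp/unix_demo/note.txt

# Copy the file, then list the directory to see both files.
cp /tmp/unix_demo/note.txt /tmp/unix_demo/copy.txt
ls /tmp/unix_demo

# Run an App (cat) on a file to print its contents.
cat /tmp/unix_demo/copy.txt

# Clean up.
rm -r /tmp/unix_demo
```

Each line is just "command, then arguments" -- the shell's whole job is to parse that and ask the kernel to run the right program.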

In 1969 that was all that sat between the Application Developer and the Hardware, and between both of those and the User.

By the 1980's the problem was that computers were no longer just for the geek-elite (coders and systems managers); now you had "users" who wanted to use computers to do things, not just to program them to do things. So what the OS was (what sat between the User and everything else) had grown a lot: many utilities to handle Graphics or Files, things that could handle streams of data like Video or Music, and many Applications that are considered necessary utilities (like Browsers and email). Purists argue that those aren't the OS -- but virtually no one would buy a commercial OS without those things. If you replaced UNIX underneath, and more than half of your users wouldn't know or care, then UNIX isn't what the users consider the OS any more. So UNIX is no longer an OS; it's more an OS toolkit: it's just the foundation that a full-featured OS is built on top of. But there are a lot of old-schoolers who still believe what they were taught in the 1970's, or what their teachers from the 1970's taught them.


UNIX

I'm both a big UNIX fan and one of its detractors. UNIX is the old war-bird of Operating Systems -- which is ironic, since it really isn't an Operating System any more -- but more on that later. UNIX was created as a private research project by AT&T's Bell Laboratories (Ken Thompson and Dennis Ritchie) in 1969. Since AT&T wasn't really in the computer business, it had the wise marketing plan of giving away the source code to UNIX for free. UNIX wallowed around and was basically only popular in education and research labs, because it was inferior to other commercial OS's of the time. But since Universities could modify this Operating System freely, many programmers cut their teeth (in school) using it, and researchers came from academia, so they used it too. This legacy has totally defined what UNIX is, and what it is good for -- and bad at. It wasn't good, it was free. But perfect is the enemy of good enough, and UNIX was always "good enough" that people used it, added to it, and it sort of became the de facto solution until it got near-universal adoption. These same pragmatic compromises are similar to why TCP/IP and HTML became the Internet.

Written 1999.10.24