UNIX


I'm both a big UNIX fan, and one of its detractors. UNIX is the old war-bird of Operating Systems -- which is ironic since it really isn't an Operating System any more -- but more on that later. UNIX was created as a private research project by AT&T's Bell Laboratories (Ken Thompson and Dennis Ritchie) in 1969. Since AT&T wasn't really in the computer business, they had the wise marketing plan of giving away the source code to UNIX for free. UNIX wallowed around and was basically only popular in education and research labs because it was inferior to other commercial OS's of the time. But since Universities could modify this Operating System freely, many programmers cut their teeth (in school) using it, and researchers came from academia, so they used it too. This legacy has totally defined what UNIX is, and what it is good for -- and bad at. It wasn't good, it was free. But perfect is the enemy of good enough, and UNIX was always "good enough" that people used it, added to it, and it sort of became the de facto solution until it reached near-universal adoption. These same pragmatic compromises are similar to why TCP/IP and HTML became the Internet.


UNIX : 7 items


Command Line Interface -
The other day, a friend and I were discussing the command line versus a GUI. His point, which I've heard a thousand times over the last 20+ years, is that unless you understand what's going on in the command line, you don't understand what's going on in the computer. The idea being that the GUI abstracts you from what's really happening, and that some people can make command lines really fly, so they must be better overall. There are really a lot of different arguments and biases in there, so I have to break them down.

How Secure are you? -
How secure are your devices from intruders? The answer is "it depends" on a lot of things: what machine you have, what you do, and so on. The short answer, in order of safety (from least secure to most): Windows, Unix, Mac, Android/Chrome, iOS. And the OS's are more secure than the Apps you run -- so iOS running only Apps from their Store is going to be a lot safer than a machine that's running software downloaded at random from the Internet.

Linux - I consider Linux a flavor of UNIX. Really, Linux is a freeware version of UNIX that works basically the same, but it was written later and was more a project in reverse engineering (copying) an Operating System. Some design parts are better, but some aren't. It is really more accurate to think of Linux as a UNIX clone than as UNIX. However, since all UNIXen (including Linux) have more in common than they have different, let's just pretend that everything that looks, smells and behaves like UNIX is UNIX. In the world of UNIX this assumption is often enough to cause fits and endless debate -- so be careful that you don't use the term UNIX as casually as I do.

MacOS X is Unix - OS X is cool because UNIX is cool. Not because UNIX is a particularly well-done OS; UNIX has tons of anachronistic design choices and plenty of legacy issues that aren't pretty or modern. But UNIX does have many strengths to counteract those issues. The Mac was not UNIX; Jaguar is UNIX.
  • 👍 The biggest strength of UNIX is that virtually all college-educated developers learn on UNIX. That gives it a huge developer base and market size, especially in App and OS development, scientific areas, and network administration. UNIX is the test-bed for most of these technologies. If Apple wants to move a new OS forward, they are going to be borrowing from UNIX anyway, so why not make their OS UNIX-derived?
  • 👍 There is also an openness about code (open source), APIs and design that has permeated UNIX. (College Marxists meet code.) Many have contributed to UNIX, and still do. When you give it away for free, others use it. This has snowballed for many decades. UNIX evolves, and you can stay closer to the cutting edge if you don't have to port everything to a different platform.
  • 👍 UNIX is also very stable, in almost all senses of the word. It can be a bitch to set up and maintain, but Apple is hiding much of that. However, once set up and configured properly, it will work, and can survive bad software practices (like poor QA) pretty well. You can set things up and just leave them, and know that when you come back, they'll probably still be running. Better than Windows, far better than the classic MacOS, and nearly as good as IBM mainframe-type solutions. This stuff is robust. On a desktop and in many other markets, this is going to make a significant difference.
  • 👎 However, when I mentioned stability I also meant it as in "not changing". While little things in UNIX evolve and change constantly, it has been UNIX for 30+ years (arguably 40). In many ways it takes a lot less time to change between versions of UNIX than it does between versions of Windows. That stability is incredibly comforting to many programmers, network administrators, academics, researchers and just plain users. They want to know their machines; and sometimes they have had a longer relationship with them than with anything or anyone else. Learning a new UNIX takes them a few hours, days or weeks, depending on the level of intimacy they want, but they then know all sorts of things in incredible detail. UNIX plays to human nature: the dislike of change, the thirst to know, and the drive to have control over one's destiny and environment. UNIX does that better than any other OS out there. OS X can ride on that. And users know that whether OS X lives or dies, most of the knowledge they gain learning about UNIX can be applied in the future. That's emotional stability for people, and they love the platform that provides that.
  • 👎 Which brings up the main issue: Jaguar is still a UNIX (and not a Mac). It is the best UNIX I've ever used (and I've used bunches), but I can't just upgrade blindly and expect things to work like a Mac. Many things require special versions of Apps or tools to run on Jaguar, many apps die after the upgrade, and the OS itself has issues after an upgrade. And don't move things around on OS X; I did that and confused the shit out of the OS. It seems that many Applications are not mine to control, but rather the OS's. Same with naming. I have bunches of things that lost their bundles, or lost track of where they were supposed to be. There are whole hidden hierarchies and voodoo; deep paths that I don't have control over. Some UNIX types will blame this on me, but they're missing the point; this is my OS, not theirs. It should behave like I own it, not like I'm beholden to the ivory tower and Apple to decide where things go, what they can be called, and what I will do. So one of the first things I notice that I've lost is the trust that I can change things and they will still work, and the faith in the robustness of the OS (as far as upgrading and not having things change). OS X (including Jaguar) is far more fragile towards change than the MacOS used to be.

    Don't get me wrong, the pre-X MacOS had plenty of quirks or bad versions that would screw things up too. But users could move anything (with only a little caution needed in the System Folder). There was a 1:1 mapping between what I saw in the Finder and what was there. I could drag-install and uninstall. I had trust that it was hard to break things and easy to fix them if I did. Jaguar isn't like that; like the UNIX it is based on, it is configuration-fragile. For newbies who don't muck with much, it's fine. Mac Power users will be frustrated by not being able to configure and tweak. But UNIX folks can't fathom why you would want to move something, or why you wouldn't spend days fixing problems if you do. And Jaguar isn't worse than Windows in that regard. So none of that is the end of the world, but I think it will be a while before we can get that stability and trust back. Or people will just adapt to the new reality that Apple doesn't want to give you that much control.

Stupid Knowledge -
In computers (as well as other aspects of life), there are things you need to know, but shouldn't have to. This isn't useful knowledge, because if it was done "right", you wouldn't need to know it at all. This is stuff you need to know, because humanity is too lazy to fix it and do it right. Thus I coined the term "Stupid Knowledge™", for things you have to know to get by (it is knowledge), but you shouldn't have to (it's stupid that you need to know this).

UNIX History - UNIX is the old war-bird of Operating Systems that was created as a private research project by AT&T's Bell Laboratories (Ken Thompson and Dennis Ritchie) in 1969. The "interface" (and I use that term loosely) was "borrowed" liberally from Honeywell's Multics (which comes from MULTiplexed Information and Computing Service). The Bell guys borrowed the name and acronym by creating UNiplexed Information and Computing Service, which was originally spelled UNICS -- but now we spell it UNIX. Two things can be learned from this:
  1. Uniplexed would imply "one" to the multiplexed's many -- and thus imply that it was a single-user system instead of a Multi-user Operating System. Nothing could be further from the truth -- almost every design decision has been made to make UNIX a multiplexed (time-sliced) multi-user operating system.
  2. With UNIX the geeks and programmers have always been in charge (with good and bad results). Programmers are not particularly good with (nor concerned about) spelling -- and these guys wanted something shorter, harder to spell and more cryptic than an acronym that made sense. So instead they spelled UNICS with an X -- thus starting a trend of bad spelling and unintuitive abbreviations that is rampant throughout every command in UNIX.

So the takeaway from both of these things is: you can't trust anything in UNIX. It's all about arcane knowledge of its history, and not what would really make sense if you were doing it "right" today.

UNIX is a foundation - Is UNIX an OS? Not really, not any more; it's the foundation of one. When Operating Systems (OS's) were created, they were basically:
  1. A Kernel: (a) messaging (routines so Apps or parts of them could talk to each other), (b) scheduling (so that each App got a fair amount of time while running), and (c) memory management (so one App couldn't crap on another App, intentionally or by accident/bug).
  2. Device Drivers: little mini-Apps that helped the OS or Applications talk to other parts of the hardware, like the keyboard, screen, network, and so on.
  3. Shell (Command Line): a simple way to issue semi-English-like commands to tell the OS which Apps to run, or how to manage files (a minimal sketch of such a shell loop is shown below).
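
To make that last item concrete, here is a minimal sketch in C, assuming a POSIX system, of what a shell's core loop does: read a command, ask the kernel to fork a new process, exec the program in the child, and wait for it to finish. The prompt name "mysh>", the fixed buffer sizes, and the lack of pipes, redirection and built-ins are purely illustrative -- this is not any real shell's implementation.

    /* Minimal illustrative shell loop (assumes a POSIX system).
       Read a line, split it into words, fork, exec, wait.
       No pipes, redirection, quoting or built-ins -- just the core cycle. */
    #include <stdio.h>
    #include <string.h>
    #include <sys/types.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void) {
        char line[1024];

        for (;;) {
            fputs("mysh> ", stdout);                /* hypothetical prompt */
            fflush(stdout);
            if (fgets(line, sizeof line, stdin) == NULL)
                break;                              /* EOF (Ctrl-D) exits the shell */

            /* Split the line into an argv[] array on whitespace. */
            char *argv[64];
            int argc = 0;
            for (char *tok = strtok(line, " \t\n");
                 tok != NULL && argc < 63;
                 tok = strtok(NULL, " \t\n"))
                argv[argc++] = tok;
            argv[argc] = NULL;
            if (argc == 0)
                continue;                           /* blank line: prompt again */

            pid_t pid = fork();                     /* kernel creates a new process */
            if (pid == 0) {
                execvp(argv[0], argv);              /* child becomes the command */
                perror(argv[0]);                    /* only reached if exec failed */
                _exit(127);
            } else if (pid > 0) {
                int status;
                waitpid(pid, &status, 0);           /* shell waits for the child */
            } else {
                perror("fork");
            }
        }
        return 0;
    }

Everything else a real shell adds -- pipes, job control, scripting -- is layered on top of this same read/fork/exec/wait cycle, and the kernel pieces from item #1 (scheduling, memory management) are what keep the child process from stomping on the shell itself.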

In 1969 that was all that sat between the Application Developer and the Hardware, and between both of those and the User. By the 1980's the problem was that computers were not just for the geek-elite (coders and systems managers) any more; now you had "users" that wanted to use computers to do things, not just to program them to do things -- so what the OS was (what sat between the User and everything else) had grown a lot. It now included many utilities to handle Graphics or Files, things that could handle streams of data like Video or Music, and many Applications that are considered necessary utilities (like Browsers and email). Purists argue that those aren't the OS -- but virtually no one would buy a commercial OS without those things. If you could replace UNIX and >50% of your users wouldn't know or care, then that's not what the users consider the OS any more. So UNIX is no longer an OS, it's more an OS toolkit: it's just the foundation that a full-featured OS is built on top of. But there are a lot of old-schoolers that still believe in what they were taught in the 1970's, or what their teachers from the 1970's have taught them.