Command Line Interface

The other day, a friend and I were discussing command lines versus GUIs. His point, which I've heard a thousand times over 20+ years, is that unless you understand what's going on at the command line, you don't understand what's going on in the computer. The idea being that the GUI abstracts you from what's really happening, and that some people can make command lines really fly, so they must be better overall. There are really a lot of different arguments and biases in there, so I have to break them down.

What is a command line and GUI?

A GUI is a graphical user interface. The user interacts with menus, widgets (controls), and windows to make things happen. Things are displayed in a graphics mode, instead of the older way of displaying everything as characters and text; that's where it picked up the 'G' in GUI.

A command line, on the other hand, is not a user interface at all. Users may interact with it, but that's coincidental. What a command line is, is just a programming interface or language, where you write little file-based computer programs, through a really lame editor, one line at a time. Then those commands get interpreted and executed as you type them.
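
To make that concrete, here's a toy sketch (in Python, purely for illustration; it's not any real shell) of that read-interpret-execute model: take one line of text, treat it as a tiny program, and run it.

# Toy illustration of the "one line at a time" model described above:
# read a line, interpret it as a command plus arguments, and execute it.
import shlex
import subprocess

def toy_shell():
    while True:
        try:
            line = input("$ ").strip()   # the "really lame editor": one line of text
        except EOFError:
            break                        # Ctrl-D ends the session
        if not line:
            continue
        if line == "exit":
            break
        args = shlex.split(line)         # interpret the line as a tiny program
        try:
            subprocess.run(args)         # execute it
        except FileNotFoundError:
            print(args[0] + ": command not found")

if __name__ == "__main__":
    toy_shell()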

There's a big difference between those two. The GUI tries to do things like present your choices, interpret user behavior, and make the user's life easier. The CLI assumes that everyone is a programmer, who will memorize all possible commands, and the variants and variables/attributes for each command, and then run those little programs to make the computer do what they want. This is fine if you're a programmer and think like a programmer, but kinda sucks if you're a user who just wants to USE the computer.

Lower is better

An implied argument is that lower level is better. Command lines are sometimes underneath or below GUIs. So what is happening is that GUI applications are actually sending commands to the command line, or interacting roughly at that level; so by understanding the command line, you're understanding the root of the problem, and thus know more about why things may or may not be happening.
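
For example, a thin GUI layer of that kind might look roughly like the sketch below; Tkinter and the "df" command here are just stand-ins for illustration, not how any particular application is actually built.

# Sketch of a "GUI on top of a CLI" design: the button doesn't do the work
# itself, it just shells out to a command-line tool underneath.
import subprocess
import tkinter as tk

def on_click():
    # The GUI layer builds and runs a command, exactly as a user could by hand.
    # "df -h" is a standard Unix disk-usage command, used here as an example.
    output = subprocess.run(["df", "-h"], capture_output=True, text=True).stdout
    text.delete("1.0", tk.END)
    text.insert(tk.END, output)

root = tk.Tk()
root.title("Disk usage (thin GUI over a CLI tool)")
tk.Button(root, text="Check disk space", command=on_click).pack()
text = tk.Text(root, width=80, height=15)
text.pack()
root.mainloop()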

There are a couple of problems with this, though.

First, not all Operating Systems are designed this way. UNIX and Windows were designed that way, with a CLI underneath a graphical interface layer, but the Macintosh was not. UNIX and Windows didn't do this because it was the best way to design a GUI; they did it because it was easier, and they had to fix less if they just hacked and bolted stuff onto what they already had.

So assuming that the CLI is fundamentally a lower level shows naiveté about true design, and just reflects a bias toward the systems that they are familiar with. It doesn't have to be like that; it just happens to be that way in the platforms that they know. You could build a command line and GUI on the same lower-level system, or have no CLI at all and build a higher-level scripting engine, like Apple tried with AppleScript. There are many ways to solve a problem.

Another issue is that if lower is better, then why stop at the command line? Why should I stop at the stupid, silly single-line editor and lame command-line language (shell script)? Why don't I go lower level and use a better language like Perl, PHP, Python, or Java? And if lower level is better, then why not use C or PL/M or an even lower-level language? Or why not go lower still, and use assembly language? And why stop there? Why not code in machine code (the binary or hexadecimal representation that the computer understands)? And why stop there? Real men code in microcode and use logic analyzers and oscilloscopes; that's where the true understanding of the computer comes into play. Until you understand propagation delay and transistor settling time, you're not a real computer user.

Of course that argument is silly. We built layers on layers on layers, so that most people wouldn't have to know all that stuff. Each layer you climb up is that much easier to learn, and allows users to know less and do more with a single action, thus making things easier to learn and faster to use. Arguing that you should know how to use a command line to use a computer is exactly like arguing that you should need to know how to manually adjust the timing and set the air-fuel mixture yourself to adjust for altitude and temperature, before you are a true driver. Silly me, I thought they made cars smarter so I wouldn't have to know that stuff. I also think there's a difference between being a driver and a mechanic; or between a user and a programmer, IT person, or solutions creator.

Who are your users?

I think the point that many people miss is: who are your users?

If your users are network administrators, then they need to know more than the average user so that they can better diagnose problems, and even write some simple scripts to make the computer do what they want or automate some behaviors.
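
As a sketch of the level of scripting I mean (the host names and ping options here are made-up examples, not anything from a real network), something like this is about as much programming as an administrator should need:

# Check a (hypothetical) list of hosts and report which ones don't answer a ping.
import subprocess

HOSTS = ["192.168.1.1", "fileserver.example.com", "printer.example.com"]

def is_up(host):
    # "-c 1" sends a single ping on Linux/macOS; Windows would use "-n 1" instead.
    result = subprocess.run(["ping", "-c", "1", host],
                            stdout=subprocess.DEVNULL,
                            stderr=subprocess.DEVNULL)
    return result.returncode == 0

for host in HOSTS:
    print(host + ": " + ("up" if is_up(host) else "DOWN"))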

If your users are software engineers or programmers, then they need to know even more than network administrators about many things, so they can write more sophisticated programs to make the computer do even more.

If your users are actually systems programmers or device driver writers, then they need to write code at an even lower level, and in many ways know even more about some parts of the system.

But if your users are users, then they damn well shouldn't have to know how to write a program to use a computer. Not even simple one-line programs like a command line.

What are you trying to do?

If there is a good interface to a program, then you will never have to get "underneath" the program in order to fix it, or to make it do what it isn't doing for you. It is only when a program fails to work the way I want it to that I have to get underneath the simple and cleanly presented GUI (and its attributes/configuration options), and dig into some ugly text file or scripted commands to fix whatever is wrong with it.

This means that if I have to go to the command line or shell scripting to make something do what I want, then they have probably already failed to write a good program. Thus they've demonstrated not competence but incompetence. This would be the same as me having to manually set the choke or timing in my car - I don't have to do that any more, because this isn't the 60s; long ago, designers perfected the automatic choke, and so the car does what I want. And for good programs, I don't have to fix or tweak the config files to get them to do what I want.

Now sure, the geeks feel cool when they can make that program do what normal users can't; but there's a difference between feeling cool and being productive, making money, getting work done, or having a high quality product.

Perfect world

In a perfect world, we'd have nice clean APIs (Application Programming Interfaces) written and documented, that would be used by GUIs and CLIs. Command lines wouldn't be little lame one-line tools; they'd instead be written on top of the GUI, to allow for menus, syntax coloring, full-screen editing (not single line), multiple command lines (programs) each running in their own window, and interaction with each other at a much better level than just text piping (instead, true object encoding), and so on. CLIs wouldn't even be CLIs; there would be a set of complex plug-ins that would allow people to expand and extend programs in ways that go way beyond scripting events - but even event scripting and data encoding would be much better. We'd have all events passed through a common event methodology, and data would be encoded in its native form, which would be a lot richer than just text. (You could send the output of one object as a picture or sound, and it would be a picture or sound, not just a stream of garbage.) And so on.
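
As a rough illustration of the difference between text piping and passing data in its native form, here's a sketch (in Python; the stages and fields are invented for illustration, not a design for any real system) of a pipeline whose stages hand each other structured records instead of flat text that every stage has to re-parse:

# Three "commands" chained together, passing structured objects instead of text.
import os

def list_files(path="."):
    # Stage 1: emit records with real fields, not lines of text.
    for name in os.listdir(path):
        full = os.path.join(path, name)
        if os.path.isfile(full):
            yield {"name": name, "size": os.path.getsize(full)}

def bigger_than(records, limit):
    # Stage 2: filter on a field directly; no awk/cut-style text surgery.
    for rec in records:
        if rec["size"] > limit:
            yield rec

def report(records):
    # Stage 3: only the last stage decides how to present the data.
    for rec in records:
        print(rec["name"] + "\t" + str(rec["size"]) + " bytes")

# Roughly the object-pipeline version of something like: ls -l | awk '$5 > 10000'
report(bigger_than(list_files(), 10000))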

Apple came closer than anyone else to doing this perfect-world thing. They tried most of these things with AppleScript, with various degrees of success. And Microsoft tried to borrow it with VBA (Visual Basic for Applications), COM, ActiveX, and some other borrowed technologies/ideas. But neither has really fully succeeded. Apple had a better design in theory, but fell short on implementation. Some of Microsoft's technologies had better implementations, but lame designs. And in the end, neither has quite succeeded to the level they could have. And the UNIX variants are definitely 1960s-designed CLIs, with 30 years of cruft added on top, one grad project at a time, without anyone actually fixing the underlying architectural problems or compromises.

Also, in this imperfect world, we have to live with what has come before. People don't like to learn anything new, so they parrot and stick with older and more primitive solutions, rather than true growth. So we've artificially increased the cost of change. Thus rather than getting best-case solutions, we stick with mediocre or "good enough". But that is the world we live in. And UNIX is older, has the most legacy, and seems to have a user base that is the most resistant to change. So while UNIX's piping and scripting are often lame and outdated compared to what we could do (if we first wiped out the bias and assumptions behind those implementations), in practice they work pretty easily, are in fact good enough, and are probably better than any alternatives to date.

Written 2002.11.29 Edited: 2018.04.11, 2018.04.15
