What is Anti-Aliasing? It is using color to increase the perceived resolution on a display. It can't really alter the resolution, but to the eye it can appear to.
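To make the idea concrete, here is a minimal sketch (my own illustration with simple RGB values, not code from the article) of the trick: instead of coloring an edge pixel all-or-nothing, you blend the foreground and background colors by how much of the pixel the shape actually covers, and the eye reads the in-between shades as a smoother, higher-resolution edge.

    # Blend an edge pixel's color by coverage instead of snapping it to on/off.
    def blend(fg, bg, coverage):
        """coverage is the fraction (0.0 to 1.0) of the pixel the shape covers."""
        return tuple(round(f * coverage + b * (1 - coverage)) for f, b in zip(fg, bg))

    black, white = (0, 0, 0), (255, 255, 255)
    print(blend(black, white, 1.0))   # fully covered pixel      -> (0, 0, 0)
    print(blend(black, white, 0.5))   # half-covered edge pixel  -> (128, 128, 128)
    print(blend(black, white, 0.25))  # barely covered pixel     -> (191, 191, 191)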
Because I was a programmer, many people ask me, "How do I get started programming?" There are many choices, and it really depends on what you are trying to do. There is application programming, scripting, web programming, and so on. The bad news is that each of those choices will alter which language or tools you should choose -- and most people don't know this in advance. The good news is that once you get the concepts, many of them carry over from language to language and tool to tool. So the most important thing is to just show no fear, dive in, and start learning; some knowledge will be throwaway, but most you'll carry with you for years to come.
What is Endian? How do you like your eggs? Big or little end up? If there are two equally valid ways to do something, then odds are that two different companies will choose to do those things differently. This is Murphy's Law in action -- and it applies to how different chip designers chose to order data in memory.
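As a rough illustration (mine, not the article's), here is the same 32-bit value laid out both ways using Python's struct module; whether the "big end" or the "little end" of the number comes first in memory depends on which convention the chip designer picked.

    import struct

    value = 0x12345678  # one 32-bit number, two possible layouts in memory

    print(struct.pack('>I', value).hex())  # big-endian:    12345678 (most significant byte first)
    print(struct.pack('<I', value).hex())  # little-endian: 78563412 (least significant byte first)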
Counting in Computerese: The Magic of Binary, Octal and Hexadecimal. Computers deal in mystical-seeming numbering systems, like hexadecimal, octal and binary. People get concerned because they sound complex, but they are really quite simple. If you read this article, you should come away with a really good understanding of what they are and how they work.
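As a quick taste (my own example, not from the article), here is the same quantity written in each base; the value never changes, only the notation does.

    n = 200  # one value, four ways of writing it down

    print(bin(n))  # 0b11001000 -- binary (base 2)
    print(oct(n))  # 0o310      -- octal (base 8)
    print(n)       # 200        -- decimal (base 10)
    print(hex(n))  # 0xc8       -- hexadecimal (base 16)

    # And they all convert back to the same number:
    print(int('11001000', 2), int('310', 8), int('c8', 16))  # 200 200 200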
Enterprise, Open Source, or Commercial tools: which is better and why? Of course the answer is, "it depends": different tools are better for different things. I know that doesn't sound revolutionary, but it does seem to perplex some people, who don't understand the different tools, the market segments they fit into, or what each is good for.
There is a computer term that you hear some geeks and industry insiders use, but that many people new to computers don't know and should. That term is FUD. FUD means "Fear, Uncertainty and Doubt", and it was the tool large companies used to scare users away from smaller companies' software (or hardware). They'd sow uncertainty so customers would buy from the safest (largest) company, even if it didn't currently have the best software, or scare them into buying the biggest program for features they might someday need (but that only added complexity today).
There's a joke amongst programmers, usually aimed at their management: "If you don't leave me alone, I'll replace you with a very small shell script." The idea is that the job is so simplistic and repetitive that a few lines of code could do it: now go away.
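In the spirit of the joke, here is a purely hypothetical sketch (the names and the task are made up for illustration) of what a "very small script" doing the repetitive part of that job might look like:

    # A tongue-in-cheek, hypothetical "very small script": ask everyone for a
    # status update, every week, forever.
    team = ["Alice", "Bob", "Carol"]  # made-up names

    def weekly_status_nag(people):
        for person in people:
            print(f"Hi {person}, any update on your tasks? Please reply by end of day.")

    weekly_status_nag(team)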
I worked for over a decade as a consultant, and used and managed consultants for a couple of decades more. As a consultant I've worked for organizations (agencies) and as an independent. I have nothing against consultants or consulting (they're a very valuable resource), but there is an art to using consultants or consulting organizations wisely, and most companies don't quite have the artistry required. This article will try to explain some of the pitfalls and some ways to use consultants better.
I'm both a big UNIX fan and one of its detractors. UNIX is the old war-bird of Operating Systems -- which is ironic since it really isn't an Operating System any more -- but more on that later. UNIX was created as a private research project at AT&T's Bell Laboratories (by Ken Thompson and Dennis Ritchie) in 1969. Since AT&T wasn't really in the computer business, they had the wise marketing plan of giving away the source code to UNIX for free. UNIX wallowed around and was basically only popular in education and research labs, because it was inferior to the other commercial OSes of the time. But since Universities could modify the Operating System freely, many programmers cut their teeth (in school) using it, and researchers came from academia, so they used it too. This legacy has totally defined what UNIX is, and what it is good for -- and bad at. It wasn't good, it was free. But perfect is the enemy of good enough, and UNIX was always "good enough" that people used it, added to it, and it sort of became a de facto solution until it reached near-universal adoption. These same pragmatic compromises are similar to why TCP/IP and HTML became the Internet.