Programming




WhatsAntiAliasing.jpeg
Anti-aliasing is a technique in computers for sacrificing clarity/contrast to give the illusion of better screen resolution.

What is Anti-Aliasing? It is using color to increase the perceived resolution of a display. It can't really alter the resolution, but it can appear to, at least to the eye.
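
To make that concrete, here's a minimal sketch (my own illustration in Python, with made-up values) of the core trick: instead of drawing an edge pixel fully on or fully off, you blend its color by how much of the pixel the shape actually covers, and the eye reads the gray as a smoother edge.

    def blend(foreground, background, coverage):
        """Mix two grayscale values (0-255) by the fraction of the pixel covered."""
        return round(foreground * coverage + background * (1 - coverage))

    # A hard (aliased) edge: each pixel is either black (0) or white (255).
    aliased = [0, 0, 0, 255, 255, 255]

    # An anti-aliased edge: the boundary pixel is 40% covered by the black shape,
    # so it becomes a gray that softens the stair-step to the eye.
    antialiased = [0, 0, blend(0, 255, 0.4), 255, 255, 255]

    print(aliased)      # [0, 0, 0, 255, 255, 255]
    print(antialiased)  # [0, 0, 153, 255, 255, 255]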

BASIC.png

Because I was a programmer, many people ask me, "How do I get started programming?" There are many choices, and it really depends on what you are trying to do. There is application programming, scripting, Web programming, and so on. The bad news is that each of those choices will alter which language or tools you should choose -- and most people don't know this in advance. The good news is that once you get the concepts, many of them follow from language to language and tool to tool. So the most important thing is to show no fear, dive in, and start learning; some knowledge will be throw-away, but most of it you'll carry with you for years to come.

Gulliverstravels.jpg

What is Endian? How do you like your eggs? Big or little end up? If there are two equally valid ways to do something, then odds are that two different companies will choose to do them differently. This is Murphy's Law in action -- and it applies to different chip designers and how they order data in memory.
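
For the curious, here's a quick sketch in Python (my own example, not from the article) showing the same 32-bit value packed big-end-first and little-end-first, plus which order the machine you're on uses natively.

    import struct
    import sys

    value = 0x0A0B0C0D

    big    = struct.pack(">I", value)   # big-endian: most significant byte first
    little = struct.pack("<I", value)   # little-endian: least significant byte first

    print(big.hex())      # 0a0b0c0d
    print(little.hex())   # 0d0c0b0a
    print(sys.byteorder)  # 'little' on x86 and most ARM setups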

CountingComputerese.png

Counting in Computerese: The Magic of Binary, Octal and Hexadecimal. Computers deal in mystical-sounding numbering systems like hexadecimal, octal and binary. People get concerned because they sound complex, but they are really quite simple. If you can read this article, you should come away with a really good understanding of what they are and how they work.
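
If you want to play along, Python will happily show that binary, octal and hexadecimal are just different ways of writing the same number (a quick sketch of my own):

    n = 200

    print(bin(n))   # 0b11001000  (base 2)
    print(oct(n))   # 0o310       (base 8)
    print(hex(n))   # 0xc8        (base 16)

    # And back again -- int() accepts the base as a second argument.
    print(int("11001000", 2), int("310", 8), int("c8", 16))  # 200 200 200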

Sandwich.png
The other day, a friend and I were discussing the command line versus a GUI. His point, which I've heard a thousand times over the past 20+ years, is that unless you understand what's going on in the command line, you don't understand what's going on in the computer. The idea being that the GUI abstracts you from what's really happening, and that some people can make command lines really fly; so they must be better overall. There are really a lot of different arguments and biases in there, so I have to break them down.
Enterprise.jpg

Enterprise, open-source or commercial tools: which is better and why? Of course the answer is, "it depends": different tools are better for different things. Now I know that doesn't sound revolutionary, but it does seem to perplex some people. People don't understand the different tools, the market segments they fit into, or what they are good for.

FUD.png

There is a computer term that you hear some geeks and industry insiders use, but many people new to computers don't know it -- and should. That term is FUD. FUD means "Fear, Uncertainty and Doubt", and it was the tool of large companies to scare users away from small companies' software (or hardware). They'd sow uncertainty so customers would buy from the safest (largest) company, even if it didn't currently have the best software, or scare them into buying the biggest program over features they might someday need (but that only added complexity today).

ForwardCompatibility.png
In the tech world, I hear all the time about "backwards" compatibility. That to me is like saying, "I wish my hot new CD player would play my 8-track tapes as well". Backwards compatibility is when you create a new function or feature for your computer, but must also keep a mode that works just like it did in the past. But this article is about learning to look forward instead.
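
As a rough sketch of the idea (my own illustration in Python, using a hypothetical settings file, not anything from the article): backwards compatibility means new code still reads old data, while forward compatibility means old code tolerates newer data it doesn't fully understand.

    def load_settings(record):
        """Read a hypothetical settings record, whatever version wrote it."""
        return {
            # Backwards compatibility: accept the old field name ("loudness")
            # and its old default, so files written years ago still load.
            "volume": record.get("volume", record.get("loudness", 50)),
            # Forward compatibility: extra fields a *newer* writer added
            # (like "equalizer" below) are simply ignored, not treated as errors.
        }

    print(load_settings({"loudness": 80}))                    # an old-style file
    print(load_settings({"volume": 30, "equalizer": "rock"})) # a newer file
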
HackCrackPhreak.png
What are hackers, crackers and phreaks? This covers the basics of how the terms evolved. They don't really mean what they meant when they were first used, so people date themselves by how they use them.
Hiring-programmers.jpg
Human Resources people, managers, and general users have no idea how simple or complex computer programming is. They think they can just throw programmers around from one task to another, and some HR people select programmers based on language (syntax), not what really matters (skills and abilities). That would be like hiring an employee based on what school they attended rather than what subjects they studied! This article will give non-programmers a better idea of what programming is about, and what they should be looking for when hiring programmers.
Hypercard.png
The History of Visual Basic is a bit of a history of early computers and Microsoft: how they borrowed other people's ideas, and even implementations, and then took the credit.
Frequency.png
MHz or GHz, what does it mean? Many people ask questions about Megahertz (MHz) or Gigahertz (GHz). Worse, some people don't ask, and assume they know what those terms mean. It isn't hard to understand (it's just clock speed) -- but it isn't as easy as some assume either. Many people get it wrong and assume that MHz (or GHz) is how much work a computer can get done: but it's more like RPMs in a car than MPH -- it can mean the engine is working harder, but not necessarily that you're going faster.
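
A back-of-the-envelope sketch (my own example, with made-up numbers) of why clock speed alone isn't "speed": useful work per second depends on both the clock and how much gets done per tick (instructions per clock, or IPC).

    def instructions_per_second(clock_ghz, ipc):
        return clock_ghz * 1e9 * ipc

    chip_a = instructions_per_second(clock_ghz=3.8, ipc=1.0)  # higher clock
    chip_b = instructions_per_second(clock_ghz=3.0, ipc=1.5)  # lower clock, wider core

    print(f"Chip A: {chip_a:.2e} instructions/sec")  # 3.80e+09
    print(f"Chip B: {chip_b:.2e} instructions/sec")  # 4.50e+09 -- the "slower" clock wins
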
Programming.jpg
What's the difference between a programmer and a software engineer? Trick question. Since "Software Engineer" (if not "Computer Scientist") has more prestige, virtually all programmers (software developers) call themselves by the fancier titles. But in truth, there used to be a difference... since the age of agile programming (rapid application development), there are scant few software engineers. Unless you're doing genetic algorithms (or Artificial Intelligence / Machine Learning) and running experiments on your code to see what it does, you're not really a Computer Scientist. But I'll cover what it used to mean.
RISCvCISC.png
During the '80s and '90s there was a computer chip design war over RISC versus CISC. What does that mean, and which is better? For a while, Intel (and AMD) were able to spend more on design and delay the inevitable, but once mobility (with performance per watt) became important, ARM and RISC designs took over from x86's older CISC design.
ImagesRaster.jpeg
Sometimes you'll hear people say "rasterized image" (often as opposed to vector images), but what exactly does that mean? Here are the very basics of pictures, or rasterized images (and a little bit about compression).
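
As a tiny sketch of my own (not from the article): a rasterized image is just a grid of pixel values, and the little run-length encoder below hints at why simple images compress so well.

    # A 4x4 grayscale raster: a white square on a black background.
    image = [
        [0,   0,   0,   0],
        [0, 255, 255,   0],
        [0, 255, 255,   0],
        [0,   0,   0,   0],
    ]

    def run_length_encode(row):
        """Collapse runs of identical pixels into [value, count] pairs."""
        runs = []
        for pixel in row:
            if runs and runs[-1][0] == pixel:
                runs[-1][1] += 1
            else:
                runs.append([pixel, 1])
        return runs

    for row in image:
        print(run_length_encode(row))
    # [[0, 4]]
    # [[0, 1], [255, 2], [0, 1]]
    # [[0, 1], [255, 2], [0, 1]]
    # [[0, 4]]
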
There's a joke amongst programmers, usually aimed at their management: "if you don't leave me alone, I'll replace you with a very small shell script". The idea is that their jobs are so simplistic and repetitive that a few lines of code could do them: now go away.
SoftwareConsultants.jpeg

I worked over a decade as a consultant, and used and managed them for a couple decades more. As a consultant I've worked for organizations (agencies) and as an independent. I have nothing against consultants or consulting (they're a very valuable resource), but there is an art to using consultants or consulting organizations wisely, and most companies don't quite have the artistry required. This article will try to explain some of the pitfalls and ways to use consultants better.

SDLC.jpeg
There are a lot of variants of the Software Development Life Cycle. And the methodology has changed over the years... but mostly that's about terminology or details, not the basics, which seem to remain fairly constant -- because physics, human nature, resource management, and the concepts behind writing software itself don't really change. So while companies (and some people) like to claim they're strict adherents to some methodology, if they are, they're idiots. The truth is every methodology adapts to the team/company, or that reflects management's lack of a clue (or flexibility).
I'm both a big UNIX fan, and one of its detractors. UNIX is the old war-bird of Operating Systems -- which is ironic since it really isn't an Operating System any more -- but more on that later. UNIX was created as a private research project by AT&T's Bell Laboratories (Ken Thompson and Dennis Ritchie) in 1969. Since AT&T wasn't really in the computer business, they had the wise marketing plan of giving away the source code to UNIX for free. UNIX wallowed around and was basically only popular in education and research labs because it was inferior to other commercial OS's of the time. But since Universities could modify this Operating System freely, many programmers cut their teeth (in school) using it, and researchers came from academia, so they used it too. This legacy has totally defined what UNIX is, and what it is good for -- and bad at. It wasn't good, it was free. But perfect is the enemy of good enough, and UNIX was always "good enough" that people used it, added to it, and it sort of became a de facto solution until it reached near-universal adoption. These same pragmatic compromises are similar to why TCP/IP and HTML became the Internet.
WhySoBuggy.jpeg
Why are programs so buggy? They're not bugs, they're features... sorry, that's an old programmer joke. Everyone has problems with their programs (software): they crash, stall, or do unexpected things. People ask about these "bugs": why there are so many, and what they can do about them. Hopefully this helps you understand why.