Programming

From iGeek



WhatsAntiAliasing.jpeg
Anti-aliasing is a technique in computers for sacrificing clarity/contrast to give the illusion of better screen resolution.

What is Anti-Aliasing? It is the use of color to increase the perceived resolution of a display. It can't really alter the resolution, but to the eye, it can appear to.
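As a rough illustration of the idea, here's a minimal sketch in Python (my example with made-up values, not code from the article): instead of drawing an edge pixel as fully on or off, an anti-aliased renderer blends it toward the background in proportion to how much of the pixel the shape actually covers.

    def blend(foreground, background, coverage):
        """Linearly mix two 8-bit gray values by the fraction of the pixel covered."""
        return round(foreground * coverage + background * (1.0 - coverage))

    # A hard edge: each pixel is either fully black (0) or fully white (255).
    hard_edge = [0, 0, 255, 255]

    # An anti-aliased edge: the pixels the edge passes through get in-between
    # shades based on coverage, so the eye sees a smoother (higher-res) edge.
    soft_edge = [0, blend(0, 255, 0.6), blend(0, 255, 0.2), 255]

    print(hard_edge)  # [0, 0, 255, 255]
    print(soft_edge)  # [0, 102, 204, 255]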

BASIC.png

Because I was a programmer, many people ask me, "How do I get started programming?" There are many choices, and it really depends on what you are trying to do. There is application programming, scripting, web programming, and so on. The bad news is that each of those choices will alter which language or tools you should choose -- and most people don't know this in advance. The good news is that once you get the concepts, many of them follow from language to language and tool to tool. So the most important thing is to just show no fear, dive in, and start learning; some knowledge will be throwaway, but most you'll carry with you for years to come.

Gulliverstravels.jpg

What is Endian? How do you like your eggs? Big or little end up? If there are two equally valid ways to do something, then odds are that two different companies will choose to do those things differently. This is Murphy's law in action -- and it applies to different chip designers and how they ordered data in memory.
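Here's a minimal sketch of what that ordering disagreement looks like (my example in Python, not from the article): the same 32-bit value laid out in memory in big-endian versus little-endian byte order.

    import struct

    value = 0x12345678

    big = struct.pack(">I", value)     # big-endian: most significant byte first
    little = struct.pack("<I", value)  # little-endian: least significant byte first

    print(big.hex())     # 12345678
    print(little.hex())  # 78563412

    # Two chips storing the "same" number can disagree byte-for-byte in memory --
    # which is exactly the big-endian vs. little-endian argument.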

CountingComputerese.png

Counting in Computerese: The Magic of Binary, Octal and Hexadecimal. Computers deal in mystical-sounding numbering systems, like Hexadecimal, Octal and Binary. People get concerned because they sound complex, but they are really quite simple. If you can read this article, you should come away with a really good understanding of what they are and how they work.
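To see how simple it is, here's a minimal sketch in Python (my example): the same number written out in binary, octal, decimal and hexadecimal.

    n = 0b11111111        # a binary literal
    print(n)              # 255   (decimal)
    print(oct(n))         # 0o377 (octal: each digit is 3 bits)
    print(hex(n))         # 0xff  (hexadecimal: each digit is 4 bits)

    # Going the other way: each hex digit maps to exactly one 4-bit "nibble",
    # which is why programmers like hex for describing bytes.
    print(format(0xA5, "08b"))  # 10100101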

Sandwich.png
The other day, a friend and I were discussing command lines versus a GUI. His point, which I've heard a thousand times over the past 20+ years, is that unless you understand what's going on in the command line, you don't understand what's going on in the computer. The idea being that the GUI abstracts you from what's really happening, and that some people can make command lines really fly; so they must be better overall. There are really a lot of different arguments and biases in there, so I have to break them down.
Database.png

What is a database? What are the kinds of databases? Why do you care? A database is just a place used to store and organize data (information). Your address book, a spreadsheet -- basically anything with a lot of similar (like) elements -- is the basics of a database. In computers, they're used all over the place. And there are so many different ways of organizing that data that we've invented a whole lingua franca of database terminology. This tries to demystify some of the basic terms.
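For a flavor of how little is really required, here's a minimal sketch in Python (my illustrative example): an address book is already a tiny database -- records (rows) made up of fields (columns) that you can query.

    # A "table" of records; each record has the same fields.
    address_book = [
        {"name": "Ada",   "city": "London",    "phone": "555-0100"},
        {"name": "Grace", "city": "Arlington", "phone": "555-0101"},
        {"name": "Linus", "city": "Helsinki",  "phone": "555-0102"},
    ]

    # A "query": find every record whose city field matches.
    in_london = [row for row in address_book if row["city"] == "London"]
    print(in_london)  # [{'name': 'Ada', 'city': 'London', 'phone': '555-0100'}]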

DigitizedSound.gif

Digitized Sound: understanding samples, rates and digital audio is really pretty simple. Sound is nothing but pressure waves traveling through the air and hitting your ear -- which your brain decodes as sound (noise or music). Computers have two basic ways of recreating sound: one is Synthesized Sound (make a waveform and tone that matches the original), and the other is to digitize the sound (sample the pressure wave very quickly, and then recreate it later). This is about how sampling is done.
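Here's a minimal sketch of sampling in Python (my example, with toy numbers): measure the height of a pressure wave at a fixed rate, and you get a list of numbers the computer can store and later play back.

    import math

    SAMPLE_RATE = 8000   # samples per second (8 kHz, roughly telephone quality)
    FREQUENCY = 440.0    # the wave being "recorded": an A note, in Hz
    DURATION = 0.001     # one millisecond, just to keep the list short

    samples = [
        math.sin(2 * math.pi * FREQUENCY * (i / SAMPLE_RATE))
        for i in range(int(SAMPLE_RATE * DURATION))
    ]

    # Eight numbers per millisecond at 8 kHz; CD audio takes 44,100 per second.
    print([round(s, 3) for s in samples])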

Enterprise.jpg

Enterprise, Open Source or Commercial tools: which is better, and why? Of course the answer is, "it depends": different tools are better for different things. Now I know that doesn't sound revolutionary, but it does seem to perplex some people. People don't understand the different tools, the market segments they fit into, or what they are good for.

FUD.png

There is a computer term that you hear some geeks and industry insiders use, but that many people new to computers don't know -- and should. That term is FUD. FUD means "Fear, Uncertainty and Doubt", and it was the tool large companies used to scare users away from smaller companies' software (or hardware). They'd sow uncertainty so customers would buy from the safest (largest) company, even if it didn't currently have the best software, or scare them into buying the biggest program for features they might someday need (but that only added complexity today).

ForwardCompatibility.png
In the tech world, I hear all the time about "backwards" compatibility. That to me is like saying, "I wish my hot new CD player would play my 8-track tapes as well". Backwards compatibility is when you create a new function or feature for your computer, but must also keep a mode that works just like it did in the past. This article, though, is about learning to look forward.
FreeFeature.png

A free feature in software is like a free lunch: there's no such thing as a free lunch. The value of something is directly related to how much it does what you need, and how much it doesn't try to do stuff you don't need. Most "free" features are things that programmers had to add for something else, or are mostly implemented because of something else, so they figure, "hey, it's free," and just release it -- which makes it a distraction, a support cost, a potential bug magnet, and something at least a few customers will learn to use even if there's a better way to do it, thus giving you a legacy nightmare.

The idea of a "free feature" is proof that engineers shouldn't pretend to be product managers (and vice versa).
HackCrackPhreak.png
What are hackers, crackers and phreaks? This is the basics of how the terms evolved. They don't really mean what they meant when they were first used. So people date themselves with how they use the terms.
Hiring-programmers.jpg
Human Resources people, managers, and general users have no idea how simple or complex computer programming is. They think they can just throw programmers from one task to another, and some HR people select computer programmers based on language (syntax) and not what really matters (skills and abilities). That would be like hiring an employee based on what school they attended and not what subjects they studied! This article will give non-programmers a better idea of what programming is about, and what they should be looking for when hiring programmers.
Hypercard.png
The History of Visual Basic is a bit of a history of early computers and Microsoft: how they borrowed other people's ideas (and even implementations) and then took the credit.
Compression.jpeg
How does software Compression work? How do you make something smaller? In some ways you don't -- you just encode the data in a more efficient way. This article explains some basic concepts about data storage and compression, which are far simpler than people realize (it's just the implementation that can involve some hairy math). Basically, compression is achieved by looking for commonality (in shape or pattern), and then saving that smaller description rather than saving the whole image. If you don't get it, read on, and it may become clearer.
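To make "save the smaller description" concrete, here's a minimal sketch of run-length encoding in Python (my example, not the article's): one of the simplest compression schemes, which just replaces runs of repeated values with a count.

    def rle_encode(data):
        """Collapse runs of repeated characters into (character, count) pairs."""
        runs = []
        for ch in data:
            if runs and runs[-1][0] == ch:
                runs[-1] = (ch, runs[-1][1] + 1)
            else:
                runs.append((ch, 1))
        return runs

    def rle_decode(runs):
        """Expand the (character, count) pairs back into the original string."""
        return "".join(ch * count for ch, count in runs)

    row = "WWWWWWWWWW" + "BBB" + "WWWWWWWW"  # a row of pixels: mostly white, a few black
    encoded = rle_encode(row)
    print(encoded)                     # [('W', 10), ('B', 3), ('W', 8)]
    print(rle_decode(encoded) == row)  # True -- lossless: nothing was thrown away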
Frequency.png
MHz or GHz: what does it mean? Many people ask questions about Megahertz (MHz) or Gigahertz (GHz). Worse, some people don't ask, and assume they know what those terms mean. It isn't hard to understand (it's just clock speed) -- but it isn't as easy as some assume either. Many people get it wrong and assume that MHz (or GHz) is how much work a computer can get done: but it's more like RPMs in a car than MPH. It can mean the engine is working harder, but not necessarily that you're going faster.
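A minimal sketch of that point in Python (the numbers are made up for illustration): how much work gets done is roughly clock speed times how much each clock tick accomplishes, so a slower clock can still win.

    def instructions_per_second(clock_hz, instructions_per_clock):
        """Rough throughput: clock ticks per second times work done per tick."""
        return clock_hz * instructions_per_clock

    chip_a = instructions_per_second(3_000_000_000, 1)  # 3 GHz, 1 instruction/cycle
    chip_b = instructions_per_second(2_000_000_000, 2)  # 2 GHz, 2 instructions/cycle

    print(f"{chip_a:,}")  # 3,000,000,000
    print(f"{chip_b:,}")  # 4,000,000,000 -- the "slower" clock gets more work done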
Programming.jpg
What's the difference between a programmer and a software engineer? Trick question. Since "Software Engineer" (or "Computer Scientist") has more prestige, virtually all programmers (software developers) call themselves by the fancier titles. But in truth, there used to be a difference... since the age of agile programming (rapid application development), there are scant few software engineers. Unless you're doing genetic algorithms (or Artificial Intelligence/Machine Learning) and running experiments on your code to see what it does, you're not really a Computer Scientist. But I'll cover what it used to mean.
RISCvCISC.png
During the '80s and '90s there was a computer chip design war over RISC versus CISC. What does that mean, and which is better? For a while, Intel (and AMD) were able to spend more on design and delay the inevitable, but once mobility (with performance per watt) became important, ARM and RISC designs took over from x86's older CISC design.
ImagesRaster.jpeg
Sometimes you'll hear people say "rasterized image" (often as opposed to vector images), but what exactly does that mean? Here are the very basics of pictures, or rasterized images (and a little bit about compression).
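As a minimal sketch of the distinction (my example in Python): a rasterized image is just a grid of pixel values, while a vector description stores the geometry that produced them.

    # Raster: a 4x4 one-bit image of a diagonal line, stored pixel by pixel.
    raster = [
        [1, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 1, 0],
        [0, 0, 0, 1],
    ]

    # Vector: the same line described as geometry; it can scale to any resolution.
    vector = {"type": "line", "from": (0, 0), "to": (3, 3)}

    for row in raster:
        print("".join("#" if pixel else "." for pixel in row))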
There's a joke amongst programmers, usually aimed at their management: "if you don't leave me alone, I'll replace you with a very small shell script". The idea is that the job is so simplistic and repetitive that a few lines of code could do it: now go away.
SoftwareConsultants.jpeg

I worked over a decade as a consultant, and used and managed them for a couple decades more. As a consultant I've worked for organizations (agencies) and as an independent. I have nothing against consultants or consulting (they're a very valuable resource), but there is an art to using consultants or consulting organizations wisely, and most companies don't quite have the artistry required. This article will try to explain some of the pitfalls and ways to use consultants better.

SDLC.jpeg
There are a lot of variants of a Software Development Life Cycle. And the methodology has changed over the years... but mostly that's about terminology or details, not the basics, which seem to remain fairly constant, because physics, human nature, resource management, and the concepts behind writing software itself don't really change. So while companies (and some people) like to claim they're strict adherents to some methodology, if they really are, they're idiots. The truth is every methodology adapts to the team/company -- or else that reflects management's lack of a clue (or flexibility).
SynthesizedSound.gif
Synthesized Sound is just making waves. Computers have two basic ways of recreating sound: one way is Digitized Sound (sample it, and then play it back later); the other is to synthesize it (make a waveform that approximates what you want) -- think of it like taking a picture versus sketching/drawing and painting. Synthesizing is the latter: creating pressure waves by algorithm rather than recording them.
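Here's a minimal sketch of the "waveform from a formula" idea in Python (my example, with toy values): a crude square wave generated purely by algorithm, with no recording involved.

    SAMPLE_RATE = 8000   # samples per second
    FREQUENCY = 1000.0   # Hz; chosen so one full cycle fits neatly in 8 samples

    def square_wave(i):
        """+1 for the first half of each cycle, -1 for the second half."""
        phase = (i * FREQUENCY / SAMPLE_RATE) % 1.0
        return 1.0 if phase < 0.5 else -1.0

    samples = [square_wave(i) for i in range(8)]
    print(samples)  # [1.0, 1.0, 1.0, 1.0, -1.0, -1.0, -1.0, -1.0]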
I'm both a big UNIX fan, and one of its detractors. UNIX is the old war-bird of Operating Systems -- which is ironic since it really isn't an Operating System any more -- but more on that later. UNIX was created as a private research project by AT&T's Bell Laboratories (Ken Thompson and Dennis Ritchie) in 1969. Since AT&T wasn't really in the computer business, they had the wise marketing plan of giving away the source code to UNIX for free. UNIX wallowed around and was basically only popular in education and research labs, because it was inferior to the other commercial OS's of the time. But since Universities could modify this Operating System freely, many programmers cut their teeth (in school) using it, and researchers came from academia, so they used it too. This legacy has totally defined what UNIX is, and what it is good for -- and bad at. It wasn't good, it was free. But perfect is the enemy of good enough, and UNIX was always "good enough" that people used it, added to it, and it sort of became the de facto solution until it reached near-universal adoption. These same pragmatic compromises are similar to why TCP/IP and HTML became the Internet.
MP3 logo.png
What is MP3? It's just a compressed file format used for sound. Video requires a lot of bandwidth, so they use fancy algorithms to compress it (make it as small as possible). Since video tracks also have sound with them, they did an exceptionally good job on sound as well. Thus, for doing sound compression, we just use the video compression format's sound compression (the Moving Picture Experts Group format for audio), called MPEG Audio Layer III, a.k.a. MP3.
WebApp.png

What is a Web Application, and how does it vary from a traditional website? There's a joke in tech, "there is no cloud: there's just somebody else's computer": in other words, you're either using your machine, or someone else's. A traditional website is just you browsing some files (in the HTML format) on someone else's computer. A Web Application is for things more complex than just reading files: the other computer has to be running an application (remotely) to serve up some of the stuff you're asking for -- like when you need to enter forms and have that do something, do complex lookups (searches of files), or basically anything where you don't just read/write information but interact with it in a more complex way. That's what Web Apps are for.
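Here's a minimal sketch of that difference in Python (my illustrative example, using only the standard library): one URL hands back the same canned HTML every time, like a static site, while the other runs code on the server for every request, like a web application.

    from http.server import BaseHTTPRequestHandler, HTTPServer
    from datetime import datetime

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/static":
                # "Traditional website": the same canned page every time.
                body = "<h1>Hello, this page never changes.</h1>"
            else:
                # "Web application": the server computes the response on demand.
                body = f"<h1>The server says it is {datetime.now():%H:%M:%S}.</h1>"
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(body.encode())

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), Handler).serve_forever()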

WhySoBuggy.jpeg
Why are programs so buggy? They're not bugs, they're features... sorry, that's an old programmer joke. Everyone has problems with their programs (software): they crash, stall, or do unexpected things. People ask about these "bugs": why are there so many, and what can they do about them? Hopefully this helps you understand why.